This library was developed for the Kaggle challenge:
[**Google - Unlocking Global Communication with Gemma**](https://www.kaggle.com/competitions/gemma-language-tuning), sponsored by Google.

### Credit Requirement

**Important:** If you are a participant in the competition and wish to use this source code in your submission,
you must clearly credit the original author before the competition's end date, **January 14, 2025**.
GitHub: [https://github.com/thewebscraping/gemma-template/](https://github.com/thewebscraping/gemma-template/)
LinkedIn: [https://www.linkedin.com/in/thetwofarm](https://www.linkedin.com/in/thetwofarm)
```

## Overview

Gemma Template is a lightweight and efficient Python library for generating templates to fine-tune models and craft prompts.
Designed for flexibility, it seamlessly supports Gemma, LLaMA, and other language frameworks, offering fast, user-friendly customization.
As a newbie, I created Gemma Template based on what I read and learned from the

Gemma Template supports exporting dataset files in three formats: `Text`, `Alpaca`, and `OpenAI`.
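
A minimal sketch of what these three record shapes generally look like. This illustrates the format conventions only, not gemma-template's own converter; the field names (`prompt`, `response`) and helper functions are assumptions:

```python
# Illustrative sketch of the three common dataset shapes.
# NOTE: the helper names and the sample's field names are assumptions,
# not part of the gemma-template API.

def to_text(sample: dict) -> str:
    # "Text" style: one flat prompt/response string per record.
    return f"{sample['prompt']}\n{sample['response']}"

def to_alpaca(sample: dict) -> dict:
    # Alpaca style: instruction / input / output fields.
    return {"instruction": sample["prompt"], "input": "", "output": sample["response"]}

def to_openai(sample: dict) -> dict:
    # OpenAI chat style: a list of role/content messages.
    return {"messages": [
        {"role": "user", "content": sample["prompt"]},
        {"role": "assistant", "content": sample["response"]},
    ]}

sample = {"prompt": "Rewrite this sentence.", "response": "Here is a rewrite."}
print(to_openai(sample)["messages"][0]["role"])  # -> user
```
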

## Multilingual Content Writing Assistant

This writing assistant is a multilingual professional writer specializing in crafting structured, engaging, and SEO-optimized content.
It enhances text readability, aligns with linguistic nuances, and preserves the original context across languages.

---

### Key Features

### 1. **Creative and Engaging Rewrites**
- Transforms input text into captivating and reader-friendly content.
- Utilizes vivid imagery and descriptive language to enhance engagement.

### 2. **Advanced Text Analysis**
- Processes text with unigrams, bigrams, and trigrams to understand linguistic patterns.
- Ensures language-specific nuances and cultural integrity are preserved.

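This kind of n-gram pass can be sketched in plain Python. A generic illustration of unigram/bigram/trigram extraction, not the library's internal implementation:

```python
def ngrams(text: str, n: int) -> list[str]:
    """Return the n-grams of whitespace-tokenized, lowercased text."""
    tokens = text.lower().split()
    return [" ".join(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

text = "Gemma Template generates fine-tuning prompts"
print(ngrams(text, 1))  # unigrams: one entry per token
print(ngrams(text, 2))  # bigrams, e.g. 'gemma template'
print(ngrams(text, 3))  # trigrams
```
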
### 3. **SEO-Optimized Responses**
- Incorporates keywords naturally to improve search engine visibility.
- Aligns rewritten content with SEO best practices for discoverability.

### 4. **Professional and Multilingual Expertise**
- Full support for creating templates in local languages.
- Supports multiple languages with advanced prompting techniques.
- Enhances vocabulary and grammar with unigram, bigram, and trigram instruction templates.
- Supports hidden-mask input text; adapts tone and style to maintain professionalism and clarity.
- Full documentation with easy-to-configure prompts and examples.

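The hidden-mask idea, hiding some input words so the model must reconstruct fluent text around them, can be sketched as follows. The function name, mask token, and ratio are illustrative assumptions, not the library's actual options:

```python
import random

def mask_words(text: str, ratio: float = 0.2, mask: str = "_____", seed: int = 42) -> str:
    # Replace roughly `ratio` of the words with a hidden-mask token.
    # A fixed seed keeps the example reproducible.
    rng = random.Random(seed)
    words = text.split()
    k = max(1, int(len(words) * ratio))
    for i in rng.sample(range(len(words)), k):
        words[i] = mask
    return " ".join(words)

print(mask_words("The quick brown fox jumps over the lazy dog"))
```
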
### 5. **Customize Advanced Response Structure and Dataset Format**
- Supports advanced customization of the response structure format.
- Compatible with other models, such as LLaMA.
- Enhances dynamic prompts using round-robin loops.
- Outputs multiple formats, such as Text, Alpaca, and OpenAI.

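The round-robin idea, rotating through several prompt variants so consecutive samples don't all receive identical instruction wording, can be sketched with `itertools.cycle`. The variant strings here are made up for illustration:

```python
from itertools import cycle

# Hypothetical prompt variants, rotated one per sample.
variants = cycle([
    "Rewrite the article below in an engaging style:\n{document}",
    "Improve readability and SEO of the following text:\n{document}",
    "Rework this content while preserving its meaning:\n{document}",
])

documents = ["First article...", "Second article...", "Third article...", "Fourth article..."]
prompts = [next(variants).format(document=doc) for doc in documents]
print(prompts[0].splitlines()[0])  # first variant
print(prompts[3].splitlines()[0])  # the cycle wraps back to the first variant
```
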
## **Installation**

To install the library, you can choose between two methods:

### **1. Install via PyPI:**

```shell
pip install gemma-template
```

### **2. Install via GitHub Repository:**

```shell
pip install git+https://github.com/thewebscraping/gemma-template.git
```

## **Quickstart**

Start using Gemma Template with just a few lines of code:

### Load Dataset

Returns a Hugging Face `Dataset` or `DatasetDict` object containing the processed prompts.

#### **Load Dataset from a data dict**

9694``` python
9795from gemma_template import gemma_template
9896
dataset = gemma_template.load_dataset(data_dict, output_format='text')  # enum: text, alpaca, openai
print(dataset['text'][0])
```

#### **Load Dataset from a local file path or Hugging Face dataset**

```python
from gemma_template import gemma_template

dataset = gemma_template.load_dataset(
    ...
)
```

### Fully Customized Template


```python
from gemma_template import Template, FieldPosition, INPUT_TEMPLATE, OUTPUT_TEMPLATE, INSTRUCTION_TEMPLATE, PROMPT_TEMPLATE
response = template_instance.apply_template(
    ...
)
print(response)
```

#### Output

```text
<start_of_turn>user