Commit 5f7d969

Implement caching for configuration and personality prompt retrieval
To improve efficiency and avoid unnecessary disk reads, caching has been added to the retrieval of the configuration file and the personality prompt. The functools.lru_cache decorator is applied to these functions so that repeated calls return the cached result instead of re-reading from disk. This should give a noticeable performance improvement in scenarios where these functions are called frequently.
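The effect of the decorator can be sketched with a minimal, self-contained example (the counter and the returned dictionary below are illustrative stand-ins, not code from the repository): after the first call, lru_cache returns the stored result without executing the function body again.

```python
import functools

calls = 0

@functools.lru_cache
def read_config():
    """Illustrative stand-in for a cached reader: the body runs only once."""
    global calls
    calls += 1
    return {"personality": "Spock"}  # pretend this was parsed from disk

first = read_config()
second = read_config()
print(calls)            # the body executed once despite two calls
print(first is second)  # the very same object is returned from the cache
```

Note that because the cached value is returned by identity, callers share one object; mutating the returned dictionary would be visible to every later caller.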
1 parent f800921 commit 5f7d969

File tree

2 files changed: +5 -2 lines changed


aicodebot/config.py

Lines changed: 3 additions & 1 deletion
@@ -1,8 +1,9 @@
 from aicodebot.helpers import logger
 from pathlib import Path
-import os, yaml
+import functools, os, yaml


+@functools.lru_cache
 def get_config_file():
     if "AICODEBOT_CONFIG_FILE" in os.environ:
         config_file = Path(os.getenv("AICODEBOT_CONFIG_FILE"))
@@ -12,6 +13,7 @@ def get_config_file():
     return config_file


+@functools.lru_cache
 def read_config():
     """Read the config file and return its contents as a dictionary."""
     config_file = get_config_file()
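One consequence of caching read_config this way: changes made to the config file on disk after the first call are not picked up until the cache is invalidated. Functions wrapped by lru_cache expose cache_clear() for exactly that. A sketch, using a stand-in function in place of the real YAML reader:

```python
import functools

@functools.lru_cache
def read_config():
    # Stand-in for the real reader, which parses a YAML config file;
    # here we just copy a module-level dict that simulates the file.
    return dict(read_config.state)

read_config.state = {"version": 1}
assert read_config()["version"] == 1   # first call: actually "reads"

read_config.state = {"version": 2}     # simulate editing the file on disk
assert read_config()["version"] == 1   # stale: cached result still returned

read_config.cache_clear()              # explicitly drop the cached entry
assert read_config()["version"] == 2   # fresh read after clearing
```

This trade-off is usually fine for a CLI process that reads its config once at startup, but long-running callers that expect live reloads would need to call cache_clear() themselves.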

aicodebot/prompts.py

Lines changed: 2 additions & 1 deletion
@@ -4,7 +4,7 @@
 from langchain import PromptTemplate
 from pathlib import Path
 from types import SimpleNamespace
-import os
+import functools, os

 # ---------------------------------------------------------------------------- #
 #                              Personality helpers                             #
@@ -97,6 +97,7 @@
 DEFAULT_PERSONALITY = PERSONALITIES["Spock"]


+@functools.lru_cache
 def get_personality_prompt():
     """Generates a prompt for the sidekick personality."""
     default_personality = DEFAULT_PERSONALITY.name
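lru_cache-wrapped functions also expose cache_info() for checking that the cache is actually being hit. A quick sketch with a hypothetical zero-argument function standing in for get_personality_prompt (the real one builds the prompt from PERSONALITIES):

```python
import functools

@functools.lru_cache
def get_personality_prompt():
    # Hypothetical stand-in: the real function assembles a prompt string.
    return "You are Spock."

get_personality_prompt()
get_personality_prompt()
info = get_personality_prompt.cache_info()
print(info.hits, info.misses)  # first call is a miss, second is a hit
```

Because the function takes no arguments, the cache holds exactly one entry, so the chosen personality is effectively fixed for the lifetime of the process.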
