
Commit d9cb000

Merge pull request Significant-Gravitas#2324 from itaihochman/iss1211

* Use BROWSE_MAX_CHUNK_LENGTH for chunking text
* Fix Issue Significant-Gravitas#1211: GPT-3.5 token limit is lower than the default

2 parents d163c56 + e34ede7, commit d9cb000
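The fix routes the chunk size through configuration instead of relying on split_text's built-in default. Below is a minimal sketch of what that config hookup might look like; the attribute name CFG.browse_chunk_max_length comes from the diff, while the environment variable name is taken from the commit message and the default value is an assumption, so the real config.py in AutoGPT may differ.

import os

class Config:
    def __init__(self):
        # Assumed wiring for illustration: the exact env var name and default
        # in AutoGPT's config.py may differ from this sketch.
        self.browse_chunk_max_length = int(
            os.getenv("BROWSE_MAX_CHUNK_LENGTH", "8192")
        )

CFG = Config()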

File tree

1 file changed: +1 -1 lines changed


autogpt/processing/text.py (+1, -1)
@@ -62,7 +62,7 @@ def summarize_text(
     print(f"Text length: {text_length} characters")
 
     summaries = []
-    chunks = list(split_text(text))
+    chunks = list(split_text(text, CFG.browse_chunk_max_length))
     scroll_ratio = 1 / len(chunks)
 
     for i, chunk in enumerate(chunks):
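For context on what the new argument controls, here is a minimal sketch of a length-bounded splitter, assuming split_text simply groups lines of text into chunks of at most max_length characters; the actual implementation in autogpt/processing/text.py may chunk differently. Passing CFG.browse_chunk_max_length instead of the function's hard-coded default keeps each chunk within the configured budget, which matters because GPT-3.5's token limit is lower than that default.

# A sketch only, not the repository's implementation.
def split_text(text: str, max_length: int = 8192):
    """Yield chunks of `text`, each at most roughly `max_length` characters."""
    paragraphs = text.split("\n")
    current_chunk: list[str] = []
    current_length = 0

    for paragraph in paragraphs:
        # Close the current chunk when adding this paragraph would exceed the budget.
        if current_length + len(paragraph) + 1 > max_length and current_chunk:
            yield "\n".join(current_chunk)
            current_chunk = []
            current_length = 0
        current_chunk.append(paragraph)
        current_length += len(paragraph) + 1

    if current_chunk:
        yield "\n".join(current_chunk)

With a signature like this, the call in the diff, chunks = list(split_text(text, CFG.browse_chunk_max_length)), builds the chunk list from the configured limit rather than the default.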

0 commit comments
