Fibery's AI disregards part of long texts

I have this simple Summary markdown automation template in the ‘Summary’ rich text field of my Page database:

[ai temperature=0.5]
Generate a categorized and bullet-pointed summary in telegram style, without title, of approximately 100 words of: {{Description}}
[/ai]

More often than not, the result is a summary of only part of the Description field text.
I tested this, and it appears that only a truncated number of tokens is sent to the AI, resulting in an incomplete summary, with no notification of that limitation.
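For anyone who wants to reproduce the test: a simple way is to ask the model to echo the end of the text. The prompt below is just an illustration, not what Fibery uses internally; if the model cannot quote the ending, the tail of the text never reached it.

[ai temperature=0]
Quote the last sentence of the following text verbatim: {{Description}}
[/ai]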

Is the limit caused by the token limit of the model Fibery currently uses for these AI calls?
Can I change the code to make it send and process larger texts?

Update: I managed to create better summaries for long texts using this:

[ai temperature=0 maxTokens=300 model=gpt-4o]

This is explained in Fibery's AI documentation.
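For reference, this is roughly what the full template looks like now, with the original prompt unchanged:

[ai temperature=0 maxTokens=300 model=gpt-4o]
Generate a categorized and bullet-pointed summary in telegram style, without title, of approximately 100 words of: {{Description}}
[/ai]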

Case closed for now :slight_smile:
