General Board / Re: ChatGPT enabled models?
« on: August 20, 2025, 09:01:18 am »
Hi All,
Indeed, LLM support has been in EA since version 17.1, and it recently got a boost in build 1713.
Configuration:
- To configure it you need to be in a model with security enabled.
- After you have logged in to the model, go to Settings -> Options -> Collaboration.
- In build 1713 you will see a "ChatGPT Configuration" area; this will be expanded to other LLMs in coming builds.
- Enter the required details: Service Address, API Key, Model, Group.
- After that you will be up and running.
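For context, those configuration fields typically correspond to an OpenAI-style chat endpoint. Below is a minimal sketch of the kind of request such a configuration produces; the endpoint path, payload shape, and placeholder values are assumptions based on the standard Chat Completions API, not Sparx documentation:

```python
import json
import urllib.request

# Illustrative placeholders matching the EA configuration fields.
SERVICE_ADDRESS = "https://api.openai.com/v1/chat/completions"  # Service Address
API_KEY = "sk-example"                                          # API Key
MODEL = "gpt-4o-mini"                                           # Model

def build_request(prompt: str) -> urllib.request.Request:
    """Build (but do not send) an OpenAI-style chat completion request."""
    payload = {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        SERVICE_ADDRESS,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_request("Summarise the selected element")
```

The API key travels in the request header, which is why sharing a configured model also shares access to your LLM account (see the caveats below).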
Once enabled, you can have chats and discussions with the LLM; we have a video that demonstrates this:
https://youtu.be/vPcu3Sui2ZY
The main features are:
- Chats have no conversation memory - good for quick, one-off questions
- Discussions have full conversation memory, so you can have an ongoing dialogue with the LLM
- A workbench to move LLM results into a workable area (create elements, push text to notes, create glossary items, etc.)
- #diagram# and #element# keywords to send the LLM the currently active diagram or currently selected element so you can ask specific questions of the LLM
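The practical difference between a Chat and a Discussion is whether earlier turns are resent with each request. This hypothetical sketch (not EA's actual implementation) shows the two message-passing styles:

```python
# A one-off Chat sends only the current question; a Discussion resends the
# accumulated history so the LLM retains conversational context.
# (Illustrative sketch only - EA's internals are not published.)

def chat(prompt: str) -> list[dict]:
    """One-off question: no memory, each call starts fresh."""
    return [{"role": "user", "content": prompt}]

class Discussion:
    """Conversation with memory: every turn carries the full history."""

    def __init__(self) -> None:
        self.history: list[dict] = []

    def turn(self, prompt: str, reply: str) -> list[dict]:
        # Record the question, snapshot what would be sent to the LLM,
        # then record the (here, pre-supplied) reply for future turns.
        self.history.append({"role": "user", "content": prompt})
        messages = list(self.history)
        self.history.append({"role": "assistant", "content": reply})
        return messages

d = Discussion()
d.turn("What does this class do?", "It parses XMI.")
second = d.turn("Can you refactor it?", "Sure...")
# The second turn carries the first question and answer as context.
```

This is why a Discussion can answer follow-up questions like "refactor it" that would be meaningless to a stateless Chat.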
A few main caveats:
- We are actively working on this feature, so dialogs and functionality will change.
- The LLM configuration is a model-level setting, meaning a) you will need to configure it for each model you want to use it in, and b) anyone you share the model with will also be able to make queries against your LLM API.
