The newest route to easier, faster writing, using AI
Add an AI-powered text editor with pre-written prompts to your app
Embed the power of AI inside your app
Spark creativity, kill writer's block, or improve readability
TinyMCE’s AI Assistant, with its pre-written prompts, helps users write better and faster. Use OpenAI’s powerful GPT models – via your own OpenAI API key – and customize them to suit your exact use case. Pre-written prompts are available out-of-the-box, and in an afternoon you can be up and running with AI-driven writing tools in a familiar, intuitive UI that works within your existing app design.
- Automate tedious work
- Shorten or lengthen
- Speed up first drafts
- Kill writer’s block
- Unjumble notes
- Summarize
- Alter tone
- Simplify
Let TinyMCE's AI Assistant do hours of work in minutes
Better AI. Useful AI. Familiar AI.
Don't just add AI to your app, make it useful
Powerful pre-written prompts
Research shows that pre-written prompts return more useful responses from generative AI applications and are more likely to increase AI usage among beginners.
Use the default out-of-the-box prompts, engineer your own, or let users write them from scratch – the choice is yours.
Use a familiar UI for AI
Users start producing content faster once you add a generative AI solution to your app – one that works in ways they expect.
Designed as a core part of the content creation workflow, instead of a bolt-on component, the UI works out-of-the-box without adding any extra AI development burdens – it actually reduces them.
Total flexibility
Connect to OpenAI’s GPT models using your OpenAI API key. That lets you leverage the power of AI while making it your own: choose the model and adjust parameters like temperature and max tokens.
You can even prime the model with instructions specific to your use case, and it easily scales as your app grows.
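Under the hood, an OpenAI Chat Completions request is just a JSON body carrying these knobs. Here is a rough sketch of how they fit together – the helper name and default values are invented for illustration; only `model`, `temperature`, `max_tokens`, and the system/user message roles come from the OpenAI API itself:

```javascript
// Hypothetical helper showing the shape of an OpenAI Chat Completions
// request body. Names and defaults here are illustrative, not part of
// any TinyMCE or OpenAI SDK.
function buildChatRequest(userPrompt, options = {}) {
  const {
    model = 'gpt-4',           // any GPT model your API key can access
    temperature = 0.7,         // higher = more creative, lower = more focused
    maxTokens = 800,           // cap the length of the response
    systemPrompt = 'You are a writing assistant embedded in a CMS.',
  } = options;
  return {
    model,
    temperature,
    max_tokens: maxTokens,
    messages: [
      // The system message "primes" the model for your use case
      { role: 'system', content: systemPrompt },
      { role: 'user', content: userPrompt },
    ],
  };
}

// Example: a low-temperature payload for a summarization prompt
const body = buildChatRequest('Summarize the selected text.', { temperature: 0.2 });
```

The same pattern scales as your app grows: the payload builder stays put while prompts and parameters vary per feature.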
How AI Assistant works
The user interface between your app and OpenAI
TinyMCE + AI Assistant inside your app
Server-side proxy for a secure connection to OpenAI (recommended for production)
OpenAI API
4 easy steps to get started…
Step 1
Add AI Assistant to your TinyMCE installation
Step 2
Connect to OpenAI using your OpenAI API key
Step 3
Choose UI options and customize the default prompts
Step 4
Ready for production? Set up a server-side proxy for secure communication with OpenAI
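Steps 1–3 above come down to a single TinyMCE configuration. The following is a sketch, not a drop-in snippet: `ai_request` and `ai_shortcuts` are the options documented for the AI Assistant plugin, but the endpoint URL, prompt wording, and response shape here are placeholders – check the docs for the exact contract in your TinyMCE version.

```javascript
// Illustrative sketch only – consult the TinyMCE docs for current option names.
tinymce.init({
  selector: '#editor',
  plugins: 'ai',                        // Step 1: load the AI Assistant plugin
  toolbar: 'aidialog aishortcuts',
  // Step 3: customize the pre-written prompts shown to users
  ai_shortcuts: [
    { title: 'Summarize', prompt: 'Summarize the following text.' },
  ],
  // Step 2: forward prompts to OpenAI. In development you might call the
  // OpenAI API directly; in production, route through your proxy (Step 4).
  ai_request: (request, respondWith) => {
    respondWith.string(() =>
      fetch('/my-openai-proxy', {       // placeholder endpoint – yours will differ
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ prompt: request.prompt }),
      })
        .then((res) => res.json())
        .then((data) => data.text)
    );
  },
});
```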
Explore Docs →
Add the AI Assistant to your TinyMCE account
Get exclusive access
AI Assistant Frequently Asked Questions
What is the AI Assistant?
TinyMCE’s AI Assistant not only adds AI to your app, it makes it useful for your users. It lets them generate, rewrite and transform their content, using OpenAI’s powerful GPT models – the same models used in ChatGPT. You can customize the user experience and define your own prompts, all within the TinyMCE config.
How much does the AI Assistant cost?
It's exclusively available as an add-on for any paid Professional or Enterprise plan. To purchase the AI Assistant, contact our sales team.
Is there a free trial?
Yes. All new TinyMCE plans come with a FREE 14-day trial of all Premium plugins, including the AI Assistant. To get access to the AI Assistant on an existing account, fill out the form on this page.
Do I need to be on a paid TinyMCE plan to access the AI Assistant?
Yes. After the 14-day trial period expires, the AI Assistant can only be purchased as an add-on to paid TinyMCE Professional or Enterprise plans.
What Large Language Model (LLM) does the AI Assistant use?
The AI Assistant uses any of the GPT models available through OpenAI’s API; you need your own OpenAI API key. At the time of writing, these include, but are not limited to, GPT-3.5 and GPT-4.
Do I need an OpenAI account and API key to make the AI Assistant work?
Yes. You are responsible for creating an OpenAI API key, and for any usage costs associated with it, which are paid directly to OpenAI. The AI Assistant provides a configurable, intuitive interface for prompts; your API key is what sends prompts to, and receives responses from, OpenAI.
Do I have to pay for API requests to OpenAI?
The AI Assistant provides a configurable, intuitive interface for OpenAI prompts; your OpenAI API key is used to send prompts to, and receive responses from, OpenAI. You are responsible for any usage costs associated with your key, paid directly to OpenAI. These costs are separate from the fee to license the AI Assistant, which is paid to TinyMCE.
How does the AI Assistant integrate with TinyMCE?
The AI Assistant is available as a paid add-on plugin, loaded through the Tiny Cloud. You install it by adding it to your TinyMCE instance and then configuring it via the tinymce.init() function.
Does the AI Assistant work with Self-hosted TinyMCE installs?
Customers with Self-hosted installs need to purchase the AI Assistant to access a downloadable ZIP of the plugin. Existing Self-hosted customers can contact their Account Manager to get access to an AI Assistant 14-day FREE trial loaded from the Tiny Cloud.
What customization options are available with AI Assistant?
The AI Assistant comes with options to show a free-form prompt input, a list of pre-written prompts (customizable by you), or both. You can also connect to the OpenAI GPT model of your choice, and set parameters such as temperature and max tokens.
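The pre-written prompt list itself is plain data in your TinyMCE config. As a sketch (the `ai_shortcuts` option and its `title`/`prompt`/`subprompts` shape follow the AI Assistant docs, but the prompt wording below is invented for illustration):

```javascript
// Sketch of a customized prompt list – the wording is illustrative.
ai_shortcuts: [
  {
    title: 'Summarize',
    prompt: 'Provide the key points of this content in a succinct summary.',
  },
  {
    title: 'Change tone',               // a nested menu of related prompts
    subprompts: [
      { title: 'Professional', prompt: 'Rewrite this content in a professional tone.' },
      { title: 'Friendly', prompt: 'Rewrite this content in a friendly, casual tone.' },
    ],
  },
],
```

Setting `ai_shortcuts: false` or providing your own array replaces the defaults entirely, so the menu shows only what fits your use case.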
Can I use my own prompts and parameters with AI Assistant?
Yes. The AI Assistant ships with default prompts designed to suit generic content creation use cases, but it’s expected that developers will customize the prompts to their apps’ and users’ needs.
Do I need to set up a server-side proxy to get the AI Assistant working?
If you want to test out the AI Assistant in a development environment, you do not need to set up a server-side proxy. However, this method is not recommended for production due to security concerns. Refer to our documentation, which describes how to set up the server-side proxy.
How do I set up a server-side proxy for secure communication with OpenAI?
Setting up the AI Assistant for production requires you to set up a server-side proxy to manage the secure flow of data between your app and OpenAI. This connection requires you to use your own OpenAI API key, and cover any API usage costs incurred directly with OpenAI. Refer to our documentation, which describes how to set up the server-side proxy.
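As a rough sketch of the idea – not TinyMCE's reference implementation; the route name and payload shape are placeholders, and the docs define the real contract – a minimal Node proxy keeps your API key on the server while the browser only ever talks to your own endpoint:

```javascript
// Minimal illustrative proxy (Node 18+, which provides a global fetch).
// Route name and payload shape are placeholders for your own contract.
const http = require('http');

http.createServer((req, res) => {
  if (req.method !== 'POST' || req.url !== '/my-openai-proxy') {
    res.writeHead(404).end();
    return;
  }
  let raw = '';
  req.on('data', (chunk) => (raw += chunk));
  req.on('end', async () => {
    const { prompt } = JSON.parse(raw);
    // The API key never reaches the browser – it lives in a server env var.
    const openaiRes = await fetch('https://api.openai.com/v1/chat/completions', {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
      },
      body: JSON.stringify({
        model: 'gpt-4',
        messages: [{ role: 'user', content: prompt }],
      }),
    });
    const data = await openaiRes.json();
    res.writeHead(200, { 'Content-Type': 'application/json' });
    res.end(JSON.stringify({ text: data.choices[0].message.content }));
  });
}).listen(8080);
```

A production deployment would add authentication, rate limiting, and input validation on this endpoint; the point of the proxy is simply that the OpenAI key is never exposed to client-side code.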