Data Security and AI - Sharing Your PostgreSQL Database With Ollama AI
You probably saw the story about your public ChatGPT queries being indexed by Google and other search engines. Yikes!
You may want to use AI with your data without having your prompts end up in a search engine, especially if you are working with sensitive data.
Previously, we covered using some of DBeaver's Artificial Intelligence Assistant features with Ollama. Ollama is popular because it runs on your system, keeping your data from travelling over networks to an AI and back. What happens on your system stays on your system, to paraphrase the Las Vegas motto. Paranoia in defense of your data is a good thing.
Secure By Default
We will start with an example of the Ollama preference settings in DBeaver Enterprise Edition. DBeaver has other security options I want to bring to your attention when working with an AI. Ollama differs from most other AIs because you do not have to obtain an API key to work with it.
[Image: AI Configuration - DBeaver Enterprise Edition]
Tailoring what you send to the AI should be your next step. The default is to share only the metadata (tables, column names, and views), not the data. Optionally, sharing information about keys and object descriptions can help the AI do a better job, at the expense of extra token consumption.
[Image: Send database structure options in this example, including foreign, unique, and primary keys]
Note that you can also send table data to the AI to help it develop a better solution. This is strictly optional, and the choice is yours: if you do not want to share the data, you do not have to. Either way, the query runs only on the local machine.
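To make the distinction concrete, here is a rough sketch using hypothetical companies and contacts tables (the names and the comment are my own illustration, not anything DBeaver sends verbatim). With the default settings, only structure like this is shared; keys and descriptions go along only if you enable those options, and actual rows are shared only if you opt in to sending table data.

```sql
-- Hypothetical schema used only to illustrate what "structure" means.
CREATE TABLE companies (
    company_id BIGINT GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
    name       TEXT NOT NULL
);

-- Structure (metadata) like this can be shared with the AI:
-- table and column names, plus keys and descriptions if those options are enabled.
CREATE TABLE contacts (
    contact_id BIGINT GENERATED ALWAYS AS IDENTITY PRIMARY KEY,  -- primary key
    company_id BIGINT REFERENCES companies (company_id),         -- foreign key
    email      TEXT NOT NULL UNIQUE,                             -- unique key
    full_name  TEXT NOT NULL
);
COMMENT ON TABLE contacts IS 'People we may need to reach';      -- object description

-- Rows like these leave your machine only if you opt in to sending table data:
-- SELECT contact_id, company_id, email, full_name FROM contacts LIMIT 5;
```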
Setting The Scope
You can define the scope of database objects that the AI should consider when generating SQL queries and providing assistance. This scope lets the AI focus on the relevant parts of the database, improving accuracy and efficiency, particularly in large schemas. It is especially handy when you have two similar schemas or databases and only one of them is intended for the work.
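As a hypothetical illustration, imagine staging and prod schemas that both contain a customers table; narrowing the scope to prod keeps the AI from picking the wrong copy when it generates a query.

```sql
-- Two hypothetical schemas with identically named tables.
CREATE SCHEMA staging;
CREATE SCHEMA prod;

CREATE TABLE staging.customers (customer_id BIGINT PRIMARY KEY, name TEXT);
CREATE TABLE prod.customers    (customer_id BIGINT PRIMARY KEY, name TEXT);

-- With the scope limited to prod, a prompt like "count the customers"
-- should resolve to this table rather than the staging copy:
SELECT count(*) FROM prod.customers;
```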
Memory
Ollama will not impress you with its performance until you have 32GB of memory. My original attempts with 16GB were slow and disappointing. Remember that you are running Ollama and DBeaver at a minimum, and perhaps a local database instance as well, all of which use memory. I suggest not attempting to use Ollama until you have at least 32GB.
Example
We ask Ollama to create a table on a local PostgreSQL instance.
[Image: We ask Ollama to create a contact list table]
[Image: Ollama responds with the DDL for the requested table]
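The exact DDL varies by model and prompt, but a contact list table generated this way typically looks something like the following sketch; the column names and types here are illustrative, not Ollama's verbatim output.

```sql
-- Illustrative contact list DDL; Ollama's actual output will differ in details.
CREATE TABLE contact_list (
    id         SERIAL PRIMARY KEY,
    first_name VARCHAR(50)  NOT NULL,
    last_name  VARCHAR(50)  NOT NULL,
    email      VARCHAR(255) UNIQUE,
    phone      VARCHAR(20),
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
```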
[Image: Now we request some test data]
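The generated test data is again model-dependent, but it usually arrives as plain INSERT statements in roughly this form; the names and values below are made up for illustration and are not Ollama's actual output.

```sql
-- Made-up sample rows matching the sketch above; Ollama's values will differ.
INSERT INTO contact_list (first_name, last_name, email, phone) VALUES
    ('Ada',   'Lovelace', 'ada@example.com',   '555-0100'),
    ('Grace', 'Hopper',   'grace@example.com', '555-0101'),
    ('Edgar', 'Codd',     'edgar@example.com', '555-0102');
```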
[Image: The table and data from Ollama]

Conclusion

DBeaver allows you to use an AI without worrying that your prompts will end up in a search engine.