Posts

Showing posts from November, 2025

DBeaver's Visual Query Builder

Quick! You need to create a complex query, but you don't remember how to join the tables. And you need to get it done ASAP. You are in luck if you have an Enterprise Edition of DBeaver handy. You can drag and drop the tables and then adjust the clauses as needed without having to dig into the table definitions.

1. Open the database. I am using the PostgreSQL DVD Rental dataset for these examples. I have opened the connection and listed the tables. As you can see, the database contains numerous tables. We are going to build a query visually to get information about the customers.

2. Open a SQL Console. The second step requires a SQL Console. You will find it under the SQL Editor tab.

3. Select Builder. On the right side of the console pane, rotated 90 degrees, is a tab labeled Builder.

4. You are ready to drag and drop tables. The Query Builder Is Read...
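As a rough illustration, the kind of customer query the Builder can assemble might look like the following. This is a hand-written sketch against the standard DVD Rental sample schema (the customer and payment tables and their columns come from that sample database); the Builder generates equivalent SQL for you as you drag tables in and adjust the clauses.

```sql
-- Join customers to their payments and total what each has spent.
-- Table and column names assumed from the PostgreSQL DVD Rental sample database.
SELECT c.customer_id,
       c.first_name,
       c.last_name,
       COUNT(p.payment_id) AS payments,
       SUM(p.amount)       AS total_spent
FROM customer AS c
JOIN payment  AS p ON p.customer_id = c.customer_id
GROUP BY c.customer_id, c.first_name, c.last_name
ORDER BY total_spent DESC
LIMIT 10;
```

The point of the visual builder is that you never have to remember that payment joins to customer on customer_id; dropping the two tables onto the canvas wires that up for you.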

CSV Loading Error Fix: Data Too Long Or Value Too Long

You are trying to load a CSV file into a database, and almost inevitably, you run into an ERROR: value too long for type character varying(64) if you are using PostgreSQL, or SQL Error [1406] [22001]: Data truncation: Data too long for column 'Name' at row 1 if you are using MariaDB or MySQL. It doesn't matter if you are using the DBeaver application or the native database import tools. Newbies will examine the very first line of the CSV file and scratch their heads, as there is no error in that line. The database is ingesting the data one line at a time, so as far as it is concerned, each line in the file is 'row 1'. The offending line in your CSV file is hiding deep in the file, and there may be many of them in there. The error message you receive is helpful, but it DOES NOT tell you which line in the CSV file is causing you this grief. You could use an 'eyeball inspection' of the data, which quickly becomes tedious. And if the CSV file has thousands of lines...
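One way to hunt down the offending rows without eyeballing the file is to load the CSV into a staging table whose columns are all TEXT (so nothing can be 'too long'), then ask the database which values would overflow the real column. A sketch for PostgreSQL; the table name, column names, and file path (staging, name, people.csv) are placeholders for your own:

```sql
-- Stage the file with TEXT columns so the load cannot fail on length.
CREATE TABLE staging (name TEXT, age TEXT, city TEXT);

-- psql meta-command; adjust the path and CSV options to match your file.
\copy staging FROM 'people.csv' WITH (FORMAT csv, HEADER true)

-- Now ask which values would overflow a VARCHAR(64).
SELECT length(name) AS len, name
FROM staging
WHERE length(name) > 64
ORDER BY len DESC;
```

Once you know the culprits, you can either widen the target column or trim the data before the real load.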

Loading The Titanic Dataset Into PostgreSQL With DBeaver Part 3

In the previous post, we asked AI to make recommendations to help clean up the data loaded directly from a CSV file. The initial data load failed because the Name column, estimated as a VARCHAR(64), turned out to be too small to hold the data, so a TEXT field was used instead. So now we want to save some disk space!

The first obvious candidate may seem to be the Name field; a TEXT field can hold up to a gigabyte of data, and none of the names is anywhere near that long. But if we switch to VARCHAR, we do not gain space, as both have the same underlying storage mechanism in PostgreSQL.

PassengerId: There are only 891 records, so we don't have to worry about growth. This is a 'fixed' list, as no one else is going to be added to it, so we can treat the column as having a fixed width. We can ask for the current maximum value of the columns. In the 'real' world, when creating a table, you want your primary key column to be sized to allow for growth, hopefully enormous ...
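The 'ask for the current maximum value' step can be done with a quick aggregate, and the shrink itself with ALTER TABLE. A sketch, assuming the table is named titanic with passengerid and name columns (adjust to your actual DDL):

```sql
-- How big do the columns actually need to be?
SELECT MAX(passengerid)  AS max_id,        -- 891 in this fixed dataset
       MAX(length(name)) AS longest_name
FROM titanic;

-- PassengerId fits comfortably in a SMALLINT (range up to 32767),
-- half the width of the default INTEGER.
ALTER TABLE titanic
    ALTER COLUMN passengerid TYPE smallint;
```

Since the passenger list is fixed at 891 rows, SMALLINT is safe here; for a growing table, you would keep the wider type.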

Migration From MySQL To PostgreSQL In Five Steps Using DBeaver

I wrote a post in my MySQL blog on migrating from MySQL to PostgreSQL using DBeaver. You can pass it along to your acquaintances who want to get off the Dolphin and onto the Elephant. Not only will DBeaver move your tables and data, but it will also let you compare them afterwards. In the post, I outline the process in five steps; DBeaver will let you do it in four.

Please share this blog with your friends to promote PostgreSQL.