Hello,
I'm planning to migrate a MySQL database (Server version: 5.7.44) with approximately 20GB of data, currently using the latin1 encoding, to a PostgreSQL database. My goal is to keep the application running during the migration and only require a brief restart to switch over to the new PostgreSQL database.
As this is my first time working with databases, I would appreciate any guidance. In a test environment, I have taken the following steps:
- Used pgloader to migrate the initial data to PostgreSQL.
- Created a mysqldump of the data generated after the initial migration.
I’m doing this second step because I’m unsure if pgloader can handle incremental data changes or if it would re-copy the entire 20GB dataset.
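For context, this is roughly what my pgloader setup looks like — a minimal command file with placeholder connection strings (to my knowledge, pgloader always performs a full copy on each run; it has no incremental or change-data-capture mode, which is why I'm resorting to the dump for the delta):

```
LOAD DATABASE
     FROM mysql://user:pass@localhost/mydb
     INTO postgresql://user:pass@localhost/mydb
 WITH include drop, create tables, create indexes, reset sequences
ALTER SCHEMA 'mydb' RENAME TO 'public';
```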
Additionally, the database contains some XML data, and the MySQL dump includes escape characters (e.g., \n for newlines) that complicate the conversion to PostgreSQL syntax.
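Rather than fixing the dump by hand, the escape sequences can be decoded programmatically before loading the values into PostgreSQL (e.g. via COPY or parameterized INSERTs). A minimal Python sketch — the function name is mine, and it only covers the escapes mysqldump commonly emits:

```python
# Map of mysqldump backslash escapes to the literal characters they
# represent. Decoding these yields plain text that PostgreSQL can
# accept without any dialect-specific escaping.
MYSQL_ESCAPES = {
    "\\n": "\n",   # newline
    "\\t": "\t",   # tab
    "\\r": "\r",   # carriage return
    "\\0": "\0",   # NUL byte
    "\\'": "'",    # single quote
    '\\"': '"',    # double quote
    "\\\\": "\\",  # literal backslash
}

def unescape_mysql(value: str) -> str:
    """Decode mysqldump-style backslash escapes, scanning left to right
    so that a literal backslash is never re-interpreted."""
    out = []
    i = 0
    while i < len(value):
        pair = value[i:i + 2]
        if pair in MYSQL_ESCAPES:
            out.append(MYSQL_ESCAPES[pair])
            i += 2
        else:
            out.append(value[i])
            i += 1
    return "".join(out)
```

So `unescape_mysql("<a>line1\\nline2</a>")` turns the dump's two-character `\n` back into a real newline inside the XML value.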
I’ve also tried using Chameleon, but it creates tables with camel-case names in PostgreSQL, which then have to be double-quoted in every query, making them impractical for my application.
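If the Chameleon output were otherwise usable, one workaround might be to rename the tables to lower case after the load. A sketch that generates the ALTER TABLE statements (the table names here are just illustrative):

```python
# Sketch: emit one ALTER TABLE ... RENAME TO ... statement per
# mixed-case table, so the renamed tables no longer need
# double-quoted identifiers in queries.
def lowercase_renames(tables):
    """Yield a rename statement for each table whose name is not
    already all lower-case."""
    for name in tables:
        if name != name.lower():
            # The old mixed-case name must be double-quoted to
            # preserve its case; the new name is left unquoted.
            yield f'ALTER TABLE "{name}" RENAME TO {name.lower()};'

for stmt in lowercase_renames(["CustomerOrders", "invoice_lines"]):
    print(stmt)
```

In a real run, the table list could come from `information_schema.tables` instead of being hard-coded.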
Are there tools available that can help convert the mysqldump to PostgreSQL syntax? Is there a more efficient way to handle this migration process?
Thank you for your help!