Anyway, you need to port the old data. Google Cloud SQL is cool, except for one big problem: SQL dump imports have zero error reporting. If an import fails, the status just turns red and tells you that an unknown error occurred. So here's how I made it work:
- dump the Postgres data, selecting just the tables you want, with a couple of extra options on:
- pg_dump --column-inserts --data-only POSTGRES_DATABASE_NAME -t TABLE_NAME -t ANOTHER_TABLE_NAME -f NEW_FILE_NAME.sql [note: you need to have psql privileges already here].
- delete the top lines of the dump file created in the previous step, up until the first INSERT line (there's a sed one-liner for this after the list).
- load it into MySQL locally, where you can actually catch any errors (the database and tables have to exist locally first; see the sketch after the list):
- mysql -u USER -p DATABASE_NAME < NEW_FILE_NAME.sql
- dump it from the local mysql:
- mysqldump -u USER -p --add-drop-table MYSQL_DATABASE_NAME TABLE_NAME ANOTHER_TABLE_NAME > FIXED_SQL_DUMP_FILE.sql
- add "use DATABASE_NAME;" as the first line of the new dump file (ignore the quotes; the name is whatever database you want the data loaded into on Google). There's a one-liner for this after the list too.
- Now you can upload this new file to a Google Cloud Storage bucket through their web UI and, from there, import it into Cloud SQL (or do both from the command line; see the last sketch below).
- pray, as you wait for the stupid thing with no error reporting to turn green.
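A few of these steps are scriptable. For the header-trimming step, a sed one-liner saves the hand-editing; this is just a sketch, and it assumes the INSERT statements in the dump start at the beginning of a line (which --column-inserts gives you). Point the mysql import at the trimmed file afterwards:
- sed -n '/^INSERT/,$p' NEW_FILE_NAME.sql > NEW_FILE_NAME.trimmed.sql
Depending on your pg_dump version there may also be some trailing SELECT pg_catalog.setval(...) lines that MySQL won't understand; if so, the local import in the next step will flag them.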
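Since this is a data-only dump, the local import also assumes the database and tables already exist in your local MySQL. Creating the database is one command (DATABASE_NAME is whatever you used above); the tables you create however your app normally does it, e.g. migrations or a schema file:
- mysql -u USER -p -e "CREATE DATABASE DATABASE_NAME;"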
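The use line can be prepended without opening an editor. This assumes GNU sed (the -i syntax is different on a Mac):
- sed -i '1i use DATABASE_NAME;' FIXED_SQL_DUMP_FILE.sql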
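And if you'd rather not babysit the browser at all, newer versions of the Cloud SDK can (as far as I can tell) do the upload and the import from the command line, which should at least leave you an exit status to check instead of a color. INSTANCE_NAME and YOUR_BUCKET are placeholders here, and you may need to give the instance's service account read access to the bucket:
- gsutil cp FIXED_SQL_DUMP_FILE.sql gs://YOUR_BUCKET/
- gcloud sql import sql INSTANCE_NAME gs://YOUR_BUCKET/FIXED_SQL_DUMP_FILE.sql --database=DATABASE_NAME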
facepalms: 7