Problem description
Is there an easy way to migrate a large (100 GB) Google Cloud SQL database to Google Datastore?
The way that comes to mind is to write a Python App Engine script for each database and table and then put the data into the Datastore. That sounds tedious, but maybe it has to be done?
Side note: the reason I'm leaving Cloud SQL is that I have JSP pages with multiple queries on them, and they are incredibly slow even with a D32 SQL instance. I hope that putting the data in the Datastore will be faster?
There seem to be a ton of questions about moving away from the Datastore to Cloud SQL, but I couldn't find this one.
Thanks
Recommended answer

Here are a few options:
- Write an App Engine MapReduce [1] program that pulls data in appropriate chunks from Cloud SQL and writes it to Datastore.
- Spin up a VM on Google Compute Engine and write a program that fetches the data from Cloud SQL and writes it to Datastore using the Datastore external API [2] (a minimal sketch of this approach follows the links below).
- Use the Datastore restore [3]. I'm not familiar with the format, so I don't know how much work it would be to produce something that the restore will accept.
[1] https://cloud.google.com/appengine/docs/python/dataprocessing/
[2] https://cloud.google.com/datastore/docs/apis/overview
[3] https://cloud.google.com/appengine/docs/adminconsole/datastoreadmin?csw=1#restoring_data
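For concreteness, here is a minimal sketch of the second option (the Compute Engine VM), assuming a Python environment with the pymysql driver and the google-cloud-datastore client library. The table name, entity kind, connection settings, and project ID are hypothetical placeholders, and any type mapping beyond plain columns (blobs, decimals, and so on) would need extra handling.

```python
# Sketch of option 2: copy rows from Cloud SQL (MySQL) into Datastore entities.
# Assumes the VM can reach the Cloud SQL instance and that the pymysql and
# google-cloud-datastore packages are installed. Table/kind names, credentials,
# and the host IP below are hypothetical placeholders.

import pymysql
from google.cloud import datastore

BATCH_SIZE = 500  # Datastore allows at most 500 entities per commit


def migrate_table(sql_conn, ds_client, table, kind, id_column="id"):
    """Copy every row of `table` into Datastore entities of `kind`."""
    with sql_conn.cursor(pymysql.cursors.DictCursor) as cursor:
        cursor.execute("SELECT * FROM {}".format(table))  # table name is trusted here
        while True:
            rows = cursor.fetchmany(BATCH_SIZE)
            if not rows:
                break
            entities = []
            for row in rows:
                # Reuse the SQL primary key as the Datastore key name so the
                # migration is idempotent and can safely be re-run after a failure.
                key = ds_client.key(kind, str(row[id_column]))
                entity = datastore.Entity(key=key)
                entity.update(row)  # column names become entity properties
                entities.append(entity)
            ds_client.put_multi(entities)


def main():
    sql_conn = pymysql.connect(
        host="10.0.0.2",        # placeholder: Cloud SQL instance IP
        user="migrator",        # placeholder credentials
        password="secret",
        database="mydb",
    )
    ds_client = datastore.Client(project="my-project")  # placeholder project ID
    try:
        # Repeat (or parallelize) this call for each table you want to move.
        migrate_table(sql_conn, ds_client, table="orders", kind="Order")
    finally:
        sql_conn.close()


if __name__ == "__main__":
    main()
```

Running one pass per table keeps each Datastore commit under the 500-entity batch limit and sidesteps the request deadlines you would run into doing the same work inside a regular App Engine handler.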