This article looks at handling Python subprocess, mysqldump, and pipes; it should be a useful reference for anyone hitting the same problem.

Problem description

I've got a problem trying to build a simple backup/upgrade database script.

The error is in the mysqldump call using subprocess:

cmdL = ["mysqldump", "--user=" + db_user, "--password=" + db_pass, domaindb + "|", "gzip", ">", databases_path + "/" + domaindb + ".sql.gz"]
print "%s: backup database %s \n\t[%s]" % (domain, domaindb, ' '.join(cmdL))
total_log.write("%s: backup database %s \n\t[%s] \n" % (domain, domaindb, ' '.join(cmdL)))
p = subprocess.Popen(cmdL, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)

Before that code, I redirect sys.stdout and sys.stderr to files, in order to have a log system.

In those logs, I find the error:

[mysqldump --user=xxxxxx --password=yyyyyyyy database_name | gzip > /home/drush-backup/2010-08-30.15.37/db/database_name.sql][Error]: mysqldump: Couldn't find table: "|"

It seems that the | character is treated as a mysqldump argument instead of as a pipe.

Looking into the Python subprocess documentation, this is normal, but how can I obtain what I need (i.e. run the command mysqldump --user=xxxxxx --password=yyyyyyyy database_name | gzip > /home/drush-backup/2010-08-30.15.37/db/database_name.sql)?

EDIT: I just saw this example in the Python docs:

output=`dmesg | grep hda`
==>
p1 = Popen(["dmesg"], stdout=PIPE)
p2 = Popen(["grep", "hda"], stdin=p1.stdout, stdout=PIPE)
output = p2.communicate()[0]

and I've edited my script:

command = ["mysqldump", "--user=" + db_user, "--password=" + db_pass, domaindb, "|", "gzip", ">", databases_path + "/" + domaindb + ".sql.gz"]
cmdL1 = ["mysqldump", "--user=" + db_user, "--password=" + db_pass, domaindb]
cmdL2 = ["gzip", ">", databases_path + "/" + domaindb + ".sql.gz"]

print "%s: backup database %s \n\t[%s]" % (domain, domaindb, ' '.join(command))
total_log.write("%s: backup database %s \n\t[%s] \n" % (domain, domaindb, ' '.join(command)))

p1 = subprocess.Popen(cmdL1, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
p2 = subprocess.Popen(cmdL2, stdin=p1.stdout, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
cmdError, cmdData = p2.communicate()

Now the command variable is just used for convenience in the logs.

This gets a step further, but it stops at the > redirection, with this error:

[Error]: gzip: >: No such file or directory
gzip: /path/to/backups/dir/natabase_name.sql.gz: No such file or directory

Obviously, if I try the command in a terminal, it works.
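One way to keep the `|` and `>` working exactly as they do in the terminal is to hand the whole command line to a shell with `shell=True`, so `/bin/sh` interprets the pipe and redirection. A minimal runnable sketch, in which `echo` stands in for the real `mysqldump` invocation and a temporary directory stands in for `databases_path`:

```python
import gzip
import os
import subprocess
import tempfile

# In the real script this string would be built from db_user, db_pass,
# domaindb and databases_path; "echo" stands in for mysqldump here so
# the sketch runs anywhere.
out_path = os.path.join(tempfile.mkdtemp(), "demo.sql.gz")
cmd = "echo 'CREATE TABLE demo (id INT);' | gzip > %s" % out_path

# shell=True hands the whole string to /bin/sh, which understands | and >
subprocess.check_call(cmd, shell=True)

# verify: decompress the file and show the round-tripped content
with gzip.open(out_path, "rb") as f:
    data = f.read()
print(data.decode().strip())
```

Note that with `shell=True` everything goes through `/bin/sh`, so any shell metacharacters in `db_pass` would be interpreted by the shell; careful quoting, or the pipe-of-`Popen`s approach from the docs, avoids that.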

Solution

I'm not sure how the pipe will get interpreted. If that's a problem, you can programmatically create a pipeline.

From: http://docs.python.org/library/subprocess.html#replacing-shell-pipeline

p1 = Popen(["dmesg"], stdout=PIPE)
p2 = Popen(["grep", "hda"], stdin=p1.stdout, stdout=PIPE)
output = p2.communicate()[0]
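Applied to this case, the same two-`Popen` pattern can replace the shell's `>` as well, by pointing gzip's stdout at an open file object instead of at a pipe. A sketch of the idea, with `echo` standing in for the real `mysqldump` command list:

```python
import gzip
import os
import subprocess
import tempfile

out_path = os.path.join(tempfile.mkdtemp(), "dump.sql.gz")

# "echo" stands in for the real command list, e.g.
# ["mysqldump", "--user=" + db_user, "--password=" + db_pass, domaindb]
p1 = subprocess.Popen(["echo", "SQL dump contents"], stdout=subprocess.PIPE)

# gzip reads from p1's stdout; its own stdout goes straight into the
# open file object, which replaces the shell's "> file" redirection
with open(out_path, "wb") as out_file:
    p2 = subprocess.Popen(["gzip"], stdin=p1.stdout, stdout=out_file)
    p1.stdout.close()  # let p1 receive SIGPIPE if gzip exits early
    p2.communicate()

# verify: decompress and show the round-tripped content
with gzip.open(out_path, "rb") as f:
    print(f.read().decode().strip())
```

No shell is involved at any point, so `|` and `>` never need to be parsed, and the password never appears in a shell command line.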

EDIT

As for file redirection, you can direct stdout to a file.

Example:

out_file = open(out_filename, "wb")
# gzip's stdout goes to the file, replacing the shell's "> file";
# its stdin must be wired to the dumping process's stdout (p1 above)
gzip_proc = subprocess.Popen(["gzip"], stdin=p1.stdout, stdout=out_file)
gzip_proc.communicate()
out_file.close()

Or, if you take Alex's advice and use Python's standard library gzip module, you can do something like this:

import gzip
import subprocess

...
#out_filename = path to gzip file

cmdL1 = ["mysqldump", "--user=" + db_user, "--password=" + db_pass, domaindb]
# keep stderr separate: with stderr=STDOUT, mysqldump's error messages
# would be mixed into the dump that gets gzipped
p1 = subprocess.Popen(cmdL1, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
dump_output = p1.communicate()[0]

f = gzip.open(out_filename, "wb")
f.write(dump_output)
f.close()
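For large databases, `communicate()` buffers the entire dump in memory before anything is written. A streaming variant copies the subprocess output into the gzip file in chunks instead. A sketch, again with `echo` standing in for the `mysqldump` command line:

```python
import gzip
import os
import subprocess
import tempfile

out_path = os.path.join(tempfile.mkdtemp(), "stream.sql.gz")

# "echo" again stands in for the real mysqldump invocation
p1 = subprocess.Popen(["echo", "-- streamed dump"], stdout=subprocess.PIPE)

# copy the child's output into the gzip file 64 KiB at a time instead of
# holding the whole dump in memory with communicate()
with gzip.open(out_path, "wb") as f:
    for chunk in iter(lambda: p1.stdout.read(64 * 1024), b""):
        f.write(chunk)
p1.wait()

# verify: decompress and show the round-tripped content
with gzip.open(out_path, "rb") as f:
    print(f.read().decode().strip())
```

The memory footprint stays at one chunk regardless of how large the dump is, which matters when backing up big databases on a small server.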

