Problem summary
Today I wanted to write a multiprocess Python script that uploads code to a server, so I tested it locally against a virtual machine, but it kept failing. The specific error is as follows:
Traceback (most recent call last):
  File "D:\python3.6.7\lib\multiprocessing\process.py", line 258, in _bootstrap
    self.run()
  File "D:\python3.6.7\lib\multiprocessing\process.py", line 93, in run
    self._target(*self._args, **self._kwargs)
  File "D:\Documents\education-server\fabfile.py", line 88, in upload
    sftp.put(local_path, target_path, confirm=True)
  File "D:\python3.6.7\lib\site-packages\paramiko\sftp_client.py", line 759, in put
    return self.putfo(fl, remotepath, file_size, callback, confirm)
  File "D:\python3.6.7\lib\site-packages\paramiko\sftp_client.py", line 717, in putfo
    reader=fl, writer=fr, file_size=file_size, callback=callback
  File "D:\python3.6.7\lib\site-packages\paramiko\util.py", line 301, in __exit__
    self.close()
  File "D:\python3.6.7\lib\site-packages\paramiko\sftp_file.py", line 82, in close
    self._close(async_=False)
  File "D:\python3.6.7\lib\site-packages\paramiko\sftp_file.py", line 104, in _close
    self.sftp._request(CMD_CLOSE, self.handle)
  File "D:\python3.6.7\lib\site-packages\paramiko\sftp_client.py", line 813, in _request
    return self._read_response(num)
  File "D:\python3.6.7\lib\site-packages\paramiko\sftp_client.py", line 843, in _read_response
    t, data = self._read_packet()
  File "D:\python3.6.7\lib\site-packages\paramiko\sftp.py", line 205, in _read_packet
    raise SFTPError("Garbage packet received")
paramiko.sftp.SFTPError: Garbage packet received
I searched the Internet for a long time without finding an answer, and only later did it click: my Linux virtual machine starts a time-synchronization job from ~/.bashrc, so the time gets synced every time a shell session opens. The catch is that on most Linux distributions bash also sources ~/.bashrc for the non-interactive sessions sshd spawns, including the one that runs the SFTP subsystem, and anything that job prints to stdout is injected into the SFTP binary stream. That stray output is exactly the "garbage packet" paramiko is complaining about.
So I commented out that line in ~/.bashrc, ran the script again, and the upload succeeded.
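A quick way to confirm this kind of corruption (a sketch using the same placeholder host and account as the rest of this post): run a silent command over SSH and see whether anything comes back. /bin/true prints nothing itself, so any output you read was produced by the remote shell's startup files, and it would corrupt SFTP in exactly the same way.

import paramiko

# Sketch: detect shell-startup output that would corrupt the SFTP stream.
# The host, username, and password below are illustrative placeholders.
client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect("192.168.129.10", username="admin", password="secret")
stdin, stdout, stderr = client.exec_command("/bin/true")
print("stdout:", repr(stdout.read()))  # should be b'' on a clean setup
print("stderr:", repr(stderr.read()))
client.close()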
can't pickle _thread.lock objects
Another problem: on Windows, multiprocessing pickles a Process's target and arguments before sending them to the child process, so they cannot involve user-defined objects that hold unpicklable state (such as a paramiko client's internal thread lock). Otherwise the following error occurs:
... ... ...
TypeError: can't pickle _thread.lock objects
The cause of this problem is that I used a method bound to a user-defined object as the process target, and that object wraps a paramiko SSH client, which holds a _thread.lock that cannot be pickled. The fix is to rewrite the target as a plain function and create the object inside it, as in the sketch below.
Before modification:
p1 = Process(target=ssh_obj.upload, args=("192.168.129.10", "admin", "aa.jar", "/root/aa.jar"))
After modification:
p1 = Process(target=upload, args=("192.168.129.10", "admin", "aa.jar", "/root/aa.jar"))  # rewrite upload as a plain function and create the SSH object inside it
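For completeness, here is a minimal sketch of what that rewritten upload function could look like. The signature and the key-based authentication are assumptions, since the original fabfile.py is not shown in full; the point is only that the paramiko client is created inside the child process, so nothing unpicklable ever crosses the process boundary.

import paramiko
from multiprocessing import Process

def upload(host, user, local_path, target_path):
    # Build the SSH/SFTP client inside the child process, so its
    # internal _thread.lock never needs to be pickled.
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(host, username=user)  # assumes key-based auth is set up
    try:
        sftp = client.open_sftp()
        # confirm=True makes paramiko stat() the remote file to verify its size
        sftp.put(local_path, target_path, confirm=True)
    finally:
        client.close()

if __name__ == "__main__":
    # Only plain strings are pickled and sent to the child process.
    p1 = Process(target=upload, args=("192.168.129.10", "admin", "aa.jar", "/root/aa.jar"))
    p1.start()
    p1.join()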