Azure blob copy with python script
【Title】Azure blob copy with python script 【Posted】2021-12-02 00:24:04 【Question】I need a Python script that creates, uploads, and copies 100 blobs from one storage account to a second one.
```python
import os, uuid
from azure.storage.blob import BlobServiceClient, BlobClient, ContainerClient, __version__
#connect_str1 = os.getenv('AZURE_STORAGE_CONNECTION_STRING1')
connect_str2 ="DefaultEndpointsProtocol=https;AccountName=arielstorageaccount2;AccountKey=n+Bx[...]Q==;EndpointSuffix=core.windows.net"
#blob_service_client = BlobServiceClient.from_connection_string(connect_str)
blob_service_client2 = BlobServiceClient.from_connection_string(connect_str2)
# Create a unique name for the container
#container_name = str(uuid.uuid4())
container_name2 = str(uuid.uuid4())
# Create the container
#container_client = blob_service_client.create_container(container_name)
container_client2 = blob_service_client2.create_container(container_name2)
local_path = r"C:\Users\pavel\PycharmProjects\pythonProject1\data"
os.mkdir(local_path)
# Create a file in the local data directory to upload and download
for i in range(100):
    local_file_name = str(uuid.uuid4()) + ".txt"
    upload_file_path = os.path.join(local_path, local_file_name)
    cwd = os.getcwd()
    print(cwd)
    # Write text to the file
    file = open(upload_file_path, 'w')
    file.write("Hello, World!")
    file.close()
    # Create a blob client using the local file name as the name for the blob
    blob_client = blob_service_client2.get_blob_client(container=container_name2, blob=local_file_name)
    print("\nUploading to Azure Storage as blob:\n\t" + local_file_name)
    # Upload the created file
    with open(upload_file_path, "rb") as data:
        blob_client.upload_blob(data)
```
I have a few problems:
1. It does create the new container, but it does not upload the blobs.
2. The code does not loop 100 times.
3. I don't understand how to copy blobs from one storage account to another.
【Comments】:
"The code doesn't do the loop 100 times" - how many times does it loop? Do you see the files created locally?
Once, and yes, I can see them locally.
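For problems 1 and 2 above, here is a minimal sketch (not the asker's original code) that creates and uploads 100 small blobs in one pass without writing local files; the connection string and container name are placeholders, and `upload_blob` accepts bytes directly:

```python
import uuid
from azure.storage.blob import BlobServiceClient

# Placeholders: substitute the real connection string and an existing container name.
connect_str = "<destination connection string>"
container_name = "<destination container>"

service = BlobServiceClient.from_connection_string(connect_str)

for i in range(100):
    blob_name = str(uuid.uuid4()) + ".txt"
    blob_client = service.get_blob_client(container=container_name, blob=blob_name)
    # upload_blob accepts bytes or a stream, so no temporary file on disk is needed.
    blob_client.upload_blob(b"Hello, World!")
    print("Uploaded {}/100: {}".format(i + 1, blob_name))
```

Every statement that should run per file has to sit inside the `for` body; if only the first line is indented, the rest of the code runs once after the loop finishes, which matches the symptom described in the comments.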
【Answer 1】:
Try this code. I tested it on my system: it downloads all the blobs from the source container to the local machine and then uploads them to the other storage account.
from azure.storage.blob import BlobServiceClient
import os

# Source storage account
source_key = 'source_key'
source_account_name = 'sourceaccounttest1'
block_blob_service = BlobServiceClient(
    account_url=f'https://{source_account_name}.blob.core.windows.net/', credential=source_key)

# Destination storage account
des_key = 'des_accKey'
des_account_name = 'desaccount'
des_blob_service_client = BlobServiceClient(
    account_url=f'https://{des_account_name}.blob.core.windows.net/', credential=des_key)

#generator = block_blob_service.list_containers("testcopy")
source_container_client = block_blob_service.get_container_client('testcopy')
des_container_client = des_blob_service_client.get_container_client('testcopy')

# Download every blob in the source container to a local file
generator = source_container_client.list_blobs()
i = 1
for blob in generator:
    print(blob.name)
    path_to_file = "localfilepath" + str(i) + ".txt"
    blob_client = source_container_client.get_blob_client(blob.name)
    with open(path_to_file, "wb") as my_blob:
        blob_data = blob_client.download_blob()
        blob_data.readinto(my_blob)
    i = i + 1

# Upload every file under the local folder to the destination container
path_remove = "C:\\"
local_path = "C:\\blobs"  # the local folder
i = 1
for r, d, f in os.walk(local_path):
    if f:
        for file in f:
            file_path_on_azure = os.path.join(r, file).replace(path_remove, "")
            file_path_on_local = os.path.join(r, file)
            file_names = 'doc' + str(i)
            blob_client = des_container_client.get_blob_client(file_names)
            i = i + 1
            print(blob_client.blob_name)
            with open(file_path_on_local, 'rb') as data:
                blob_client.upload_blob(data)
Output
The downloaded blobs are saved locally in the blobs folder.
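If the goal is only to copy the blobs between the two accounts (problem 3 in the question), the download/upload round trip can also be replaced by a server-side copy with `start_copy_from_url`. The sketch below is an outline under assumptions: the connection strings, container names, and account key are placeholders, and a short-lived read-only SAS token is generated so the destination service can read the private source blobs:

```python
from datetime import datetime, timedelta
from azure.storage.blob import (
    BlobServiceClient, BlobSasPermissions, generate_blob_sas,
)

# Placeholders: substitute real connection strings, container names, and the source account key.
source_service = BlobServiceClient.from_connection_string("<source connection string>")
dest_service = BlobServiceClient.from_connection_string("<destination connection string>")
source_container = source_service.get_container_client("<source container>")
dest_container = dest_service.get_container_client("<destination container>")

for blob in source_container.list_blobs():
    # Short-lived read-only SAS so the copy operation can read the private source blob.
    sas = generate_blob_sas(
        account_name=source_service.account_name,
        container_name=source_container.container_name,
        blob_name=blob.name,
        account_key="<source account key>",
        permission=BlobSasPermissions(read=True),
        expiry=datetime.utcnow() + timedelta(hours=1),
    )
    source_url = f"{source_container.url}/{blob.name}?{sas}"
    # The storage service copies the blob itself; nothing is downloaded locally.
    dest_container.get_blob_client(blob.name).start_copy_from_url(source_url)
```

`start_copy_from_url` is asynchronous on the service side; the progress of each copy can be checked later through `get_blob_properties().copy.status`.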
【Comments】: