Uploading files from Google Cloud Datalab to a Google Storage bucket using the Python API

I'm trying to upload files from a Datalab instance to my Google Storage bucket, from within the notebook itself, using the Python API, but I can't figure it out. The code samples Google provides in its documentation don't seem to work in Datalab. I'm currently using gsutil commands, but would like to understand how to do this with the Python API.


File layout (I want to upload the Python files located in the checkpoints folder):


!ls -R

.:
checkpoints  README.md  tpot_model.ipynb

./checkpoints:
pipeline_2020.02.29_00-22-17.py  pipeline_2020.02.29_06-33-25.py
pipeline_2020.02.29_00-58-04.py  pipeline_2020.02.29_07-13-35.py
pipeline_2020.02.29_02-00-52.py  pipeline_2020.02.29_08-45-23.py
pipeline_2020.02.29_02-31-57.py  pipeline_2020.02.29_09-16-41.py
pipeline_2020.02.29_03-02-51.py  pipeline_2020.02.29_11-13-00.py
pipeline_2020.02.29_05-01-17.py

Current code:


import google.datalab.storage as storage
from pathlib import Path

bucket = storage.Bucket('machine_learning_data_bucket')

for file in Path('').rglob('*.py'):
    # API CODE GOES HERE

Current working solution:


!gsutil cp checkpoints/*.py gs://machine_learning_data_bucket


米琪卡哇伊
1 Answer

蓝山帝景

Here is the code that worked for me:

from google.cloud import storage
from pathlib import Path

storage_client = storage.Client()
bucket = storage_client.bucket('bucket')

for file in Path('/home/jupyter/folder').rglob('*.py'):
    blob = bucket.blob(file.name)
    blob.upload_from_filename(str(file))
    print("File {} uploaded to {}.".format(file.name, bucket.name))

Output:

File file1.py uploaded to bucket.
File file2.py uploaded to bucket.
File file3.py uploaded to bucket.

EDIT

Alternatively, you can use:

import google.datalab.storage as storage
from pathlib import Path

bucket = storage.Bucket('bucket')

for file in Path('/home/jupyter/folder').rglob('*.py'):
    blob = bucket.object(file.name)
    blob.write_stream(file.read_text(), 'text/plain')
    print("File {} uploaded to {}.".format(file.name, bucket.name))

Output:

File file1.py uploaded to bucket.
File file2.py uploaded to bucket.
File file3.py uploaded to bucket.
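One detail worth noting: both snippets above use `file.name`, which is only the final path component, so the `checkpoints/` prefix is dropped and files with the same basename in different folders would overwrite each other in the bucket. A minimal sketch of building object names that keep the folder prefix (the `blob_name_for` helper and the paths are illustrative, not part of the original answer):

```python
from pathlib import Path

def blob_name_for(file_path: str, root: str) -> str:
    """Build a bucket object name that keeps the path relative to root,
    e.g. 'checkpoints/pipeline.py' instead of just 'pipeline.py'."""
    return Path(file_path).relative_to(root).as_posix()

print(blob_name_for("/home/jupyter/checkpoints/pipeline.py", "/home/jupyter"))
# → checkpoints/pipeline.py

# With google-cloud-storage you would then upload with (not run here):
#   bucket.blob(blob_name_for(str(f), root)).upload_from_filename(str(f))
```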
