I need to run a BigQuery script in Python that outputs its results as a CSV in Google Cloud Storage. Currently, my script triggers the BigQuery query and saves the output directly to my computer.
However, I need it to run in Airflow, so I can't have any local dependencies.
My current script saves the output to my local machine, and then I have to move it to GCS. I've searched online but can't figure it out. (P.S. I'm very new to Python, so I apologize in advance if this has been asked before!)
```python
import datetime

import pandas as pd
from googleapiclient import discovery
from oauth2client.client import GoogleCredentials

def run_script():
    # Run the BigQuery query and save the result as a local CSV.
    df = pd.read_gbq('SELECT * FROM `table/veiw` LIMIT 15000',
                     project_id='PROJECT',
                     dialect='standard'
                     )
    df.to_csv('XXX.csv', index=False)

def copy_to_gcs(filename, bucket, destination_filename):
    # Upload a local file to the given GCS bucket via the JSON API.
    credentials = GoogleCredentials.get_application_default()
    service = discovery.build('storage', 'v1', credentials=credentials)
    body = {'name': destination_filename}
    req = service.objects().insert(bucket=bucket, body=body, media_body=filename)
    resp = req.execute()

current_date = datetime.date.today()
filename = r"C:\Users\LOCALDRIVE\ETC\ETC\ETC.csv"
bucket = 'My GCS BUCKET'
str_prefix_datetime = datetime.datetime.now().strftime('%Y%m%d_%H%M%S')
destfile = 'XXX' + str_prefix_datetime + '.csv'

run_script()
copy_to_gcs(filename, bucket, destfile)
```
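For what it's worth, below is a rough sketch of what I imagine the Airflow-friendly version would look like, assuming the google-cloud-storage client library and reusing the placeholder project, query, and bucket names from above. I haven't been able to get something like this working, which is the core of my question:

```python
# A rough, untested sketch: query BigQuery and upload the CSV straight
# from memory to GCS, assuming the google-cloud-storage client library.
# PROJECT, the query, and the bucket name are placeholders from above.
import datetime

import pandas as pd
from google.cloud import storage

def run_script_to_gcs(bucket_name, destination_filename):
    df = pd.read_gbq('SELECT * FROM `table/veiw` LIMIT 15000',
                     project_id='PROJECT',
                     dialect='standard'
                     )
    # Serialize the DataFrame to a CSV string instead of a local file.
    csv_data = df.to_csv(index=False)
    client = storage.Client()
    bucket = client.bucket(bucket_name)
    blob = bucket.blob(destination_filename)
    # Upload the string directly, so nothing touches the local disk.
    blob.upload_from_string(csv_data, content_type='text/csv')

str_prefix_datetime = datetime.datetime.now().strftime('%Y%m%d_%H%M%S')
run_script_to_gcs('My GCS BUCKET', 'XXX' + str_prefix_datetime + '.csv')
```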