Creating pandas DataFrames from BigQuery and MySQL in Python, and writing a DataFrame to BigQuery and Google Cloud Storage as CSV

Posted at 2019-06-25

Overview

I wanted functions that build a pandas DataFrame from BigQuery and from MySQL, plus functions that save a DataFrame to BigQuery and to GCS as CSV. I finally put them together, so here is a short write-up.

Code

sample.py
import pandas as pd
import os
import pymysql
import dsclient

# initialize the client with a project ID and a credentials file
client = dsclient.Client(
    "YOUR_GCP_PROJECT_NAME",
    "/hogehoge/hogehoge/credentials.json"
)


def bq2df(query):
    # run a query on BigQuery and return the result as a DataFrame
    return client.query(query)


def mysql2df(query):
    # read data from MySQL (e.g. Cloud SQL) into a DataFrame
    connection = pymysql.connect(
        host='XXXXXXX',
        user='YOUR_USERNAME',
        password='YOUR_PASSWORD',
        db='YOUR_DBNAME',
        charset='utf8',
        cursorclass=pymysql.cursors.DictCursor
    )
    try:
        # run the query, then make sure the connection is closed
        return pd.read_sql(query, connection)
    finally:
        connection.close()


def df2bq(df, dataset, table):
    # write the DataFrame to the BigQuery table dataset.table
    tb_name = dataset + "." + table
    client.load(df, tb_name)


def df2gcs(df, folder, file_name, bucket="gs://hogehoge/"):
    # store the DataFrame in Google Cloud Storage as a CSV file
    gcs_path = os.path.join(bucket, folder, file_name)
    client.write_csv(df, gcs_path)


if __name__ == "__main__":
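    # write a small test DataFrame to GCS as CSV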
    df = pd.DataFrame([i for i in range(100)], columns=["test"])
    folder = "write_test"
    file_name = "test1.csv"
    df2gcs(df, folder, file_name)
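
The main block above only exercises df2gcs. For reference, here is a minimal usage sketch of the other three helpers; the query, dataset, and table names below are placeholders I made up, so adjust them to your own environment.

# hypothetical usage of the remaining helpers (all names are placeholders)

# pull a query result from BigQuery into a DataFrame
df_bq = bq2df("SELECT * FROM `YOUR_GCP_PROJECT_NAME.some_dataset.some_table` LIMIT 10")

# pull rows from MySQL into a DataFrame
df_sql = mysql2df("SELECT * FROM some_table LIMIT 10")

# write a DataFrame back to BigQuery as some_dataset.some_table
df2bq(df_bq, "some_dataset", "some_table")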

Don't mind the comments being in English.
Swap out each hogehoge etc. for your own values and you should be good to go.
This is super handy.
