DataFrame.to_gbq(destination_table, project_id, chunksize=None, verbose=None, reauth=False, if_exists='fail', private_key=None, auth_local_webserver=False, table_schema=None)
Write a DataFrame to a Google BigQuery table.
This function requires the pandas-gbq package.
Authentication to the Google BigQuery service is via OAuth 2.0. If private_key is provided, the library loads the JSON service account credentials and uses those to authenticate. If private_key is not provided, the library tries application default credentials.

Parameters
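As a sketch of the two ways a service-account key can be supplied (the credential contents below are placeholders, not a real key), `private_key` accepts either the JSON contents as a string or a path to a JSON file:

```python
import json
import tempfile

# Placeholder service-account credentials (structure only; real keys
# come from the Google Cloud console).
creds = {"type": "service_account", "project_id": "my-project"}

# Option 1: pass the JSON contents as a string.
key_as_string = json.dumps(creds)

# Option 2: pass a path to a JSON file on disk.
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump(creds, f)
    key_as_path = f.name

# Either value could be handed to DataFrame.to_gbq(..., private_key=...).
# With no private_key at all, pandas-gbq falls back to the OAuth flows
# described above.
print(key_as_string)
print(key_as_path)
```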
destination_table : str
    Name of table to be written, in the form 'dataset.tablename'.
project_id : str
    Google BigQuery Account project ID.
chunksize : int, optional
    Number of rows to be inserted in each chunk from the dataframe. Set to None to load the whole dataframe at once.
reauth : bool, default False
    Force Google BigQuery to reauthenticate the user. This is useful if multiple accounts are used.
if_exists : str, default 'fail'
    Behavior when the destination table exists. Value can be one of:
    'fail'
        If table exists, do nothing.
    'replace'
        If table exists, drop it, recreate it, and insert data.
    'append'
        If table exists, insert data. Create if does not exist.
private_key : str, optional
    Service account private key in JSON format. Can be file path or string contents. This is useful for remote server authentication (e.g. Jupyter/IPython notebook on a remote host).
auth_local_webserver : bool, default False
    Use the local webserver flow instead of the console flow when getting user credentials.
    New in version 0.2.0 of pandas-gbq.
table_schema : list of dicts, optional
    List of BigQuery table fields to which the DataFrame columns conform, e.g. [{'name': 'col1', 'type': 'STRING'}, ...].
    New in version 0.3.1 of pandas-gbq.
verbose : boolean, deprecated
    Deprecated in pandas-gbq 0.4.0. Use the logging module to adjust verbosity instead.
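Putting the parameters together, here is a minimal usage sketch. The project ID, table name, and the schema-inference helper are hypothetical illustrations, not part of pandas; the upload call itself requires pandas-gbq and valid Google Cloud credentials, so it is shown but not executed:

```python
import pandas as pd

df = pd.DataFrame({"name": ["alice", "bob"], "score": [1.5, 2.0]})

# Hypothetical helper mapping pandas dtypes to BigQuery field types,
# for building the table_schema argument.
_BQ_TYPES = {"object": "STRING", "float64": "FLOAT", "int64": "INTEGER",
             "bool": "BOOLEAN", "datetime64[ns]": "TIMESTAMP"}

def infer_table_schema(frame):
    """Return a list of {'name': ..., 'type': ...} dicts, one per column."""
    return [{"name": col, "type": _BQ_TYPES.get(str(dtype), "STRING")}
            for col, dtype in frame.dtypes.items()]

schema = infer_table_schema(df)
print(schema)

# The actual upload (needs credentials, so it is commented out here):
# df.to_gbq("my_dataset.scores", project_id="my-project",
#           if_exists="append", table_schema=schema)
```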
See also
pandas_gbq.to_gbq
pandas.read_gbq
© 2008–2012, AQR Capital Management, LLC, Lambda Foundry, Inc. and PyData Development Team
Licensed under the 3-clause BSD License.
http://pandas.pydata.org/pandas-docs/version/0.23.4/generated/pandas.DataFrame.to_gbq.html