# Google BigQuery
[Google BigQuery](https://cloud.google.com/bigquery) is a fully-managed, serverless data warehouse that enables scalable analysis over petabytes of data.
## Terminology
**Bucket** is a container that holds your data. Everything that you store in Cloud Storage must be contained in a bucket. You can use buckets to organize your data and control access to your data, but unlike directories and folders, you cannot nest buckets.
**Project** organizes all your Google Cloud resources. All data in Cloud Storage belongs inside a project. A project consists of a set of users, a set of APIs, and billing, authentication, and monitoring settings for those APIs.
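For illustration, here is a minimal sketch (assuming the `google-cloud-storage` Python client library and a hypothetical project id `my-project`) showing that buckets live directly inside a project as a flat list, with no nesting:

```python
from google.cloud import storage

# One project owns the resources; every bucket belongs to some project.
client = storage.Client(project="my-project")  # "my-project" is a placeholder Project Id

# Buckets form a flat namespace inside Cloud Storage: you can list and name
# them, but you cannot nest one bucket inside another.
for bucket in client.list_buckets():
    print(bucket.name)
```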
## Establishing Connection
The Google BigQuery connector uses Google authentication and requires you to specify **Project Id**, **DataSet Id**, and **Cloud Storage Bucket**.
### Getting Credentials
* To locate your **Project Id**, go to the [Dashboard page](https://console.cloud.google.com/home) and look for the **Project Info** card. Your **Project Id** is listed there.

* To locate your **Dataset Id**, go to the [BigQuery](https://console.cloud.google.com/bigquery) page and, on the **Explorer** panel, expand your project and select a dataset. The **Dataset Id** is displayed under **Dataset Info**.

* To get your **Bucket** name, go to the [Buckets](https://console.cloud.google.com/projectselector2/storage/browser?supportedpurview=project) page and, in the bucket list, find the bucket you want to use; its name is shown in the **Name** column.
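If you prefer not to click through the console, the sketch below lists the same three values programmatically. It assumes the `google-cloud-bigquery` and `google-cloud-storage` Python client libraries and Application Default Credentials; it is only an alternative way to look up the values, not part of the Skyvia setup itself.

```python
from google.cloud import bigquery, storage

# Assumes Application Default Credentials are configured, e.g. via
# `gcloud auth application-default login`.
bq = bigquery.Client()
gcs = storage.Client()

print("Project Id:", bq.project)

print("Dataset Ids:")
for dataset in bq.list_datasets():
    print(" ", dataset.dataset_id)

print("Cloud Storage Buckets:")
for bucket in gcs.list_buckets():
    print(" ", bucket.name)
```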
### Creating Connection
Follow these steps when [creating a connection](https://docs.skyvia.com/connections/#creating-connections) between Skyvia and Google BigQuery:

1. Sign in with your Google account.
2. Enter **Project Id**, **DataSet Id**, and **Cloud Storage Bucket**.
3. Click **Create Connection**.
### Additional Connection Parameters
#### Use Bulk API
**Use Bulk Import** defines whether Import packages with Google BigQuery set as a target use bulk import. We recommend keeping this parameter turned on, because bulk import significantly increases import speed. The downside of this approach is that you do not get access to a [per-record error log](https://docs.skyvia.com/data-integration/package-run-history.html).
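To illustrate why bulk loading trades per-record error reporting for speed, here is a rough sketch using the `google-cloud-bigquery` Python client. The project, table, bucket, and file names are hypothetical, and this is not Skyvia's internal implementation; it only contrasts row-by-row inserts with a bulk load job.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")
table_id = "my-project.my_dataset.my_table"

# Row-by-row streaming inserts: slower, but errors come back per record.
rows = [{"id": 1, "name": "Alice"}, {"id": 2, "name": "Bob"}]
errors = client.insert_rows_json(table_id, rows)
if errors:
    print("Per-record errors:", errors)

# Bulk load from a file staged in Cloud Storage: much faster for large
# volumes, but failures surface at the level of the whole load job.
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
)
load_job = client.load_table_from_uri(
    "gs://my-bucket/staged_rows.csv", table_id, job_config=job_config
)
load_job.result()  # waits for the job to finish, raises on failure
```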
#### Use Legacy SQL
**Use Legacy SQL** enables support for legacy SQL, which allows reserved keywords in some places where standard SQL does not.
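As an illustration with the `google-cloud-bigquery` Python client (project, dataset, table, and column names are hypothetical), the same query can be run in either dialect; note the different table-reference syntax between standard and legacy SQL:

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

# Standard SQL (the default): table references use backticks.
standard_job = client.query(
    "SELECT name FROM `my-project.my_dataset.my_table` LIMIT 10"
)

# Legacy SQL: opted into per query; table references use [project:dataset.table].
legacy_job = client.query(
    "SELECT name FROM [my-project:my_dataset.my_table] LIMIT 10",
    job_config=bigquery.QueryJobConfig(use_legacy_sql=True),
)

for row in legacy_job.result():
    print(row.name)
```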
#### Command Timeout
**Command Timeout** defines how long to wait for a command to execute before an error is generated. Command Timeout does not affect the wait time for data fetching.
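Skyvia applies this setting internally, but the behavior is analogous to waiting on a query job with a client-side timeout. A minimal sketch with the `google-cloud-bigquery` Python client, using hypothetical names, purely as an analogy:

```python
import concurrent.futures

from google.cloud import bigquery

client = bigquery.Client(project="my-project")
job = client.query("SELECT COUNT(*) FROM `my-project.my_dataset.my_table`")

try:
    # Wait at most 30 seconds for the command to complete before raising.
    rows = job.result(timeout=30)
    print(list(rows))
except concurrent.futures.TimeoutError:
    print("Command timed out")
```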
## Connector Specifics
Since Google BigQuery does not have primary or unique keys, the following limitations apply in Skyvia:
* [Synchronization](https://docs.skyvia.com/data-integration/synchronization/) is not supported for Google BigQuery.
* The UPSERT operation in Import is not supported for Google BigQuery.
* When performing import with the UPDATE or DELETE operation, you need to manually specify the columns that will be treated as a primary key (see the sketch after this list).
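For example, an update keyed on manually chosen columns can be expressed in BigQuery as a `MERGE` whose `ON` clause lists those columns. The sketch below uses the `google-cloud-bigquery` Python client with hypothetical table and column names; it illustrates the idea of nominating key columns, not Skyvia's internal implementation.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

# UPDATE-style import keyed on a manually chosen column (id): rows from a
# staging table overwrite the matching rows in the target table.
merge_sql = """
MERGE `my-project.my_dataset.target_table` AS t
USING `my-project.my_dataset.staging_table` AS s
ON t.id = s.id  -- the column(s) you nominate as the primary key
WHEN MATCHED THEN
  UPDATE SET t.name = s.name, t.amount = s.amount
"""
client.query(merge_sql).result()
```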
## Supported Actions
Skyvia supports all the [common actions](https://docs.skyvia.com/data-integration/actions.html#common-actions) for Google BigQuery.