# How to write a TaskTemplate Plugin

1. Make a copy of [an example plugin](https://github.com/flyteorg/flytekit/tree/master/plugins/flytekit-sqlalchemy) and rename it to `flytekit-<newPlugin>`.
1. Sign up for a Snowflake account.
1. Download and install the SnowSQL CLI.
1. Test the connection:

   ```bash
   export SNFLK_ACCOUNTNAME='<your subdomain>'
   export SNFLK_USERNAME='<your username>'
   export SNOWSQL_PWD='<your password>'
   export SNFLK_DBNAME=SNOWFLAKE_SAMPLE_DATA
   export SNFLK_SCHEMANAME=TPCH_SF001
   export SNFLK_WAREHOUSENAME=COMPUTE_WH

   snowsql -a "${SNFLK_ACCOUNTNAME}" -u "${SNFLK_USERNAME}" -d "${SNFLK_DBNAME}" -s "${SNFLK_SCHEMANAME}" -w "${SNFLK_WAREHOUSENAME}" -o exit_on_error=True -q "SELECT * FROM CUSTOMER;"
   ```

1. Create a storage integration:
   1. Follow the [Snowflake guide](https://docs.snowflake.com/en/user-guide/data-load-s3-config-storage-integration.html).
   1. Run:

      ```sql
      USE ROLE ACCOUNTADMIN;

      CREATE OR REPLACE STORAGE INTEGRATION s3_customer
        TYPE = EXTERNAL_STAGE
        STORAGE_PROVIDER = S3
        ENABLED = TRUE
        STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::757267728559:role/snowflake'
        STORAGE_ALLOWED_LOCATIONS = ('s3://katrina2/');
      ```

   1. Run:

      ```sql
      DESC INTEGRATION s3_customer;
      ```

1. Create a table under `DEMO_DB`; let's call it `sample`, with a single column `c1`.
1. Populate it with some data:

   ```sql
   USE SCHEMA DEMO_DB.PUBLIC;

   -- Create the table first (the column type is an assumption; any string type works for this demo)
   CREATE TABLE IF NOT EXISTS "DEMO_DB"."PUBLIC"."SAMPLE" (C1 VARCHAR);

   DESC TABLE "DEMO_DB"."PUBLIC"."SAMPLE";
   INSERT INTO "DEMO_DB"."PUBLIC"."SAMPLE" VALUES ('hello'), ('world');
   SELECT * FROM "DEMO_DB"."PUBLIC"."SAMPLE";
   ```

1. Make sure we can retrieve the data from the `sample` table:

   ```bash
   export SNFLK_ACCOUNTNAME='<your subdomain>'
   export SNFLK_USERNAME='<your username>'
   export SNOWSQL_PWD='<your password>'
   export SNFLK_DBNAME=DEMO_DB
   export SNFLK_SCHEMANAME=PUBLIC
   export SNFLK_WAREHOUSENAME=COMPUTE_WH

   snowsql -a "${SNFLK_ACCOUNTNAME}" -u "${SNFLK_USERNAME}" -d "${SNFLK_DBNAME}" -s "${SNFLK_SCHEMANAME}" -w "${SNFLK_WAREHOUSENAME}" -o exit_on_error=True -q 'USE ROLE ACCOUNTADMIN; SELECT * FROM "DEMO_DB"."PUBLIC"."SAMPLE";' -o log_level=DEBUG
   ```

1. Create an S3 stage:

   ```sql
   USE ROLE ACCOUNTADMIN;
   USE SCHEMA DEMO_DB.PUBLIC;

   CREATE STAGE s3_stage
     STORAGE_INTEGRATION = s3_customer
     URL = 's3://katrina2/';
   ```

1. Populate the S3 stage with data from the `sample` table:

   ```bash
   export SNFLK_ACCOUNTNAME='<your subdomain>'
   export SNFLK_USERNAME='<your username>'
   export SNOWSQL_PWD='<your password>'
   export SNFLK_DBNAME=DEMO_DB
   export SNFLK_SCHEMANAME=PUBLIC
   export SNFLK_WAREHOUSENAME=COMPUTE_WH

   snowsql -a "${SNFLK_ACCOUNTNAME}" -u "${SNFLK_USERNAME}" -d "${SNFLK_DBNAME}" -s "${SNFLK_SCHEMANAME}" -w "${SNFLK_WAREHOUSENAME}" -o exit_on_error=True -q 'USE ROLE ACCOUNTADMIN; COPY INTO @s3_stage/sample2 FROM (SELECT * FROM "DEMO_DB"."PUBLIC"."SAMPLE") file_format = (compression = none);' -o log_level=DEBUG
   ```

1. Develop the plugin.
1. Test:
   1. `pip install` the plugin (quote the URL so the shell does not interpret the `&`):

      ```bash
      pip install 'git+https://github.com/flyteorg/flytekit.git@53dfaf30dcf3d1f60a492ddf585b2037c0be789e#egg=pkg&subdirectory=plugins/flytekit-snowflake'
      ```
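The "develop the plugin" step above follows flytekit's TaskTemplate-plugin pattern: a user-facing config object is serialized into the task template's `custom` field, which the backend plugin reads at execution time. Below is a minimal, dependency-free sketch of that pattern; the class names, fields, and the `get_custom` shape here are illustrative stand-ins, not the real flytekit API.

```python
# Sketch of the TaskTemplate-plugin pattern (illustrative names, not real flytekit classes):
# the task holds user config and a query template, and serializes both into a plain dict
# that would end up in the task template's `custom` field for the backend to consume.
from dataclasses import dataclass
from typing import Any, Dict


@dataclass
class SnowflakeConfig:
    """User-facing connection config (hypothetical stand-in for the plugin's config class)."""
    account: str
    database: str
    schema: str
    warehouse: str


class SnowflakeTask:
    """Hypothetical task type: pairs a query template with its Snowflake config."""

    def __init__(self, name: str, query_template: str, task_config: SnowflakeConfig):
        self.name = name
        self.query_template = query_template
        self.task_config = task_config

    def get_custom(self) -> Dict[str, Any]:
        # Everything the backend plugin needs to run the query, as a serializable dict.
        c = self.task_config
        return {
            "account": c.account,
            "database": c.database,
            "schema": c.schema,
            "warehouse": c.warehouse,
            "statement": self.query_template,
        }


task = SnowflakeTask(
    name="snowflake_sample",
    query_template='SELECT * FROM "DEMO_DB"."PUBLIC"."SAMPLE";',
    task_config=SnowflakeConfig(
        account="<your subdomain>",
        database="DEMO_DB",
        schema="PUBLIC",
        warehouse="COMPUTE_WH",
    ),
)
print(task.get_custom()["database"])  # DEMO_DB
```

The real plugin does the same thing with flytekit's base classes doing the serialization plumbing; the point is that all execution details travel in the template's `custom` payload rather than in user code.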