# Introducing Dokku ([@tei-k](https://github.com/tei-k))
---

---
## What is Dokku?
- http://dokku.viewdocs.io/dokku/
- "The smallest PaaS implementation you've ever seen"
- A self-hosted mini [Heroku](https://devcenter.heroku.com/articles/how-heroku-works)
- Written mostly in [Bash](https://github.com/dokku/dokku/blob/master/dokku)
---
## Requirements
- Ubuntu 16.04/18.04, Debian 9+, CentOS 7, ...
- Memory >= 1GB
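A quick pre-flight check for the memory requirement; a sketch assuming a Linux host with `/proc/meminfo`:

```shell
# Verify the host meets the >= 1 GB memory requirement (Linux only).
mem_kb=$(awk '/^MemTotal:/ {print $2}' /proc/meminfo)
if [ "$mem_kb" -ge 1048576 ]; then
  echo "memory OK: ${mem_kb} kB"
else
  echo "memory too low: ${mem_kb} kB (need >= 1048576 kB)"
fi
```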
---
## Install
:::info
```
$ wget https://raw.githubusercontent.com/dokku/dokku/v0.20.4/bootstrap.sh
$ sudo DOKKU_TAG=v0.20.4 bash bootstrap.sh
```
:::
---
## VirtualBox
:::info
```
$ brew cask install virtualbox
$ brew cask install vagrant
$ vagrant init bento/ubuntu-18.04
$ vi Vagrantfile # set the private network IP (192.168.33.10) and memory (4096 MB)
$ vagrant up
$ vagrant ssh
```
:::
---
## Set up SSH key
When the installation finishes, open your server's IP in a browser and follow the web installer to finish configuring Dokku (hostname and SSH public key).
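If you prefer the CLI over the web installer, a dedicated key pair can be generated locally and then registered on the server with `dokku ssh-keys:add` (next slide). A sketch, using a temporary directory and a hypothetical key name `admin`:

```shell
# Generate a dedicated key pair locally (temp dir so nothing is overwritten).
keydir=$(mktemp -d)
ssh-keygen -t ed25519 -N "" -C "dokku-admin" -f "$keydir/dokku_key" >/dev/null
cat "$keydir/dokku_key.pub"
# On the Dokku host, the public key would then be registered with:
#   dokku ssh-keys:add admin "$keydir/dokku_key.pub"
```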

---
## Add public key
:::info
```
$ dokku ssh-keys:add <name> [/path/to/key]
```
:::
---
## Plugins
https://github.com/dokku/dokku/blob/master/docs/community/plugins.md
:::info
```
$ sudo dokku plugin:install https://github.com/dokku/dokku-postgres.git
```
:::
---
## Deploy tutorial
#### Create database
:::info
```
$ dokku postgres:create railsdatabase
```
:::
---
#### Create app
:::info
```
$ dokku apps:create ruby-getting-started
```
:::
---
#### Link db and app
:::info
```
$ dokku postgres:link railsdatabase ruby-getting-started
$ dokku config ruby-getting-started
=====> ruby-getting-started env vars
DATABASE_URL: postgres://postgres:d6f85ff44b5e89381146b17fe2de249a@dokku-postgres-railsdatabase:5432/railsdatabase
DOKKU_APP_RESTORE: 1
DOKKU_APP_TYPE: herokuish
DOKKU_PROXY_PORT: 80
DOKKU_PROXY_PORT_MAP: http:80:5000
GIT_REV: b10d10fe37e35ca444992009b7eb6567599c87a0
```
:::
---
#### Clone source
:::info
```
$ git clone https://github.com/heroku/ruby-getting-started
$ git remote add dokku dokku@192.168.33.10:ruby-getting-started
```
:::
---
#### Deploy
:::info
```
$ git push dokku master
```
:::
---
#### http://${ip} :tada:

---
## Dokku client on macOS
:::info
```
$ brew install dokku/repo/dokku
$ export DOKKU_HOST=192.168.33.10
$ dokku apps:list
This is not a git repository
=====> My Apps
ruby-getting-started
```
:::
---
## Example 1: Jenkins
---
#### Install
:::info
```
$ dokku apps:create jenkins
$ docker pull jenkins/jenkins:2.234-alpine
$ docker tag jenkins/jenkins:2.234-alpine dokku/jenkins:2.234-alpine
$ dokku tags:deploy jenkins 2.234-alpine
```
:::
---
#### Storage mount
:::info
```
$ mkdir /var/jenkins_home
$ chmod 755 /var/jenkins_home
$ dokku storage:mount jenkins /var/jenkins_home:/var/jenkins_home
```
:::
---
#### Proxy map
:::info
```
$ dokku proxy:ports-add jenkins http:80:8080  # <scheme>:<host-port>:<container-port>
```
:::
---
#### Domain
:::info
```
$ dokku domains:set-global 192.168.33.10.xip.io
$ dokku domains:add jenkins jenkins.dokku.192.168.33.10.xip.io
```
:::
---
http://jenkins.dokku.192.168.33.10.xip.io

---
## Example 2: Rundeck
---
#### MySQL
:::info
```
$ dokku plugin:install https://github.com/dokku/dokku-mysql.git mysql
$ dokku mysql:create rundeckdb
```
:::
---
#### Create app
:::info
```
$ dokku apps:create rundeck
$ dokku mysql:link rundeckdb rundeck
```
:::
---
#### Deploy app
:::info
```
$ git clone git@github.com:jjethwa/rundeck.git
$ cd rundeck
$ git remote add dokku dokku@192.168.33.10:rundeck
$ git push dokku master
```
:::
---
http://rundeck.dokku.192.168.33.10.xip.io

---
## Example 3: Fluentd
---
#### Dockerfile
:::info
```
$ cat Dockerfile
FROM ruby:2.5-alpine
RUN apk add --no-cache g++ make curl bash mariadb-dev tzdata
RUN mkdir /app
COPY . /app/
WORKDIR /app
RUN bundle install
ENV FLUENTD_CONF="td-agent.rb"
EXPOSE 24224 5000
CMD ["sh", "-c", "fluentd -c ./config/${APP_PROFILE}/${FLUENTD_CONF}"]
```
:::
---
### Gemfile
:::info
```
source 'https://rubygems.org'
gem 'fluentd', '~> 1.10.0'
gem 'fluent-plugin-bigquery', '~> 2.2.0'
gem 'fluent-plugin-add', '~> 0.0.7'
gem 'fluent-plugin-s3', '~> 1.3.0'
```
:::
---
#### td-agent.rb
:::info
```
## var
app_name = ENV['APP_NAME'] || 'test'
app_env = ENV['APP_ENV'] || 'staging'
source_type = ENV['SOURCE_TYPE'] || 'forward'
source_port = ENV['SOURCE_PORT'] || 24224
log_dir = ENV['LOG_DIR'] || '/var/log/fluentd'
log_type = "#{ENV['LOG_TYPE'] || 'nginx.access nginx.error rails'}".split
bq_project_id = ENV['BQ_PROJECT_ID'] || 'bq-test'
bq_json_key = ENV['BQ_JSON_KEY_PATH'] || '/app/.key/bq_key.json'
# for bq tuning.
flush_interval = ENV['BQ_FLUSH_INTERVAL'] || "1s"
try_flush_interval = ENV['BQ_TRY_FLUSH_INTERVAL'] || 0.05
queued_chunk_flush_interval = ENV['BQ_QUEUED_CHUNK_INTERVAL'] || 0.01
buffer_chunk_limit = ENV['BQ_BUFFER_CHUNK_LIMIT'] || "768k"
buffer_queue_limit = ENV['BQ_BUFFER_QUEUE_LIMIT'] || 2048
buffer_chunk_records_limit = ENV['BQ_BUFFER_CHUNK_RECORDS_LIMIT'] || 300
num_threads = ENV['BQ_NUM_THREADS'] || 4
## bigquery
log_type.each do |t|
  match("uuid.service.#{app_env}.#{app_name}.#{t}") {
    type :copy
    store {
      type :bigquery
      method :insert
      buffer('time') {
        flush_interval flush_interval
        timekey '1d'
      }
      inject {
        time_key :time
        time_type :string
        time_format '%Y-%m-%d %H:%M:%S'
        timezone 'Asia/Tokyo'
      }
      auth_method :json_key
      json_key bq_json_key
      project bq_project_id
      dataset "#{app_env}_#{app_name}_streaming"
      auto_create_table :true
      table "#{t.sub('.', '_')}%Y%m%d"
      try_flush_interval try_flush_interval
      queued_chunk_flush_interval queued_chunk_flush_interval
      buffer_chunk_limit buffer_chunk_limit
      buffer_queue_limit buffer_queue_limit
      buffer_chunk_records_limit buffer_chunk_records_limit
      num_threads num_threads
      schema_path "/app/config/schema/#{t.sub('.', '_')}_schema.json"
      insert_id_field :uuid
    }
  }
end

match('**') {
  type :null
}
```
:::
---
#### Deploy
:::info
```
$ dokku apps:create fluentd
$ git init
$ git remote add dokku dokku@192.168.33.10:fluentd
$ git push dokku master
```
:::
---
## Example 4: [Embulk](https://github.com/embulk/embulk)
---
#### Dockerfile
:::info
```
FROM jruby:9.1.17-alpine
RUN echo "http://dl-cdn.alpinelinux.org/alpine/edge/main" >> /etc/apk/repositories
RUN apk add --no-cache python3 py-pip jq gzip curl bash tzdata coreutils && \
    python3 -m ensurepip && \
    rm -r /usr/lib/python*/ensurepip && \
    pip install --upgrade pip setuptools && \
    if [ ! -e /usr/bin/pip ]; then ln -s pip3 /usr/bin/pip ; fi && \
    rm -r /root/.cache
RUN pip install awscli
ENV EMBULK_VERSION 0.8.39
RUN curl -kL https://bintray.com/artifact/download/embulk/maven/embulk-${EMBULK_VERSION}.jar -o /opt/embulk.jar
RUN mkdir /app
WORKDIR /app
COPY embulk /app/embulk
COPY Gemfile* /app/
RUN /usr/bin/java -jar /opt/embulk.jar bundle install --path vendor/bundle
COPY pigeonhole /app/
COPY restore /app/
COPY restore_alb /app/
COPY common /app/
CMD ["/bin/bash", "-c", "tail -f /dev/null;"]
```
:::
---
#### Gemfile
:::info
```
source 'https://rubygems.org/'
gem 'embulk-output-bigquery', '~> 0.6.0'
gem 'embulk-input-s3', '~> 0.3.0'
gem 'embulk-parser-jsonl', '~> 0.2.0'
gem 'embulk-formatter-jsonl', '~> 0.1.4'
gem 'embulk-parser-regex', '~> 0.2.0'
gem 'embulk-parser-grok', '~> 0.1.7'
gem 'embulk-filter-eval', '~> 0.1.0'
gem 'embulk-parser-ltsv', '~> 0.1.1'
gem 'tzinfo-data', platforms: [:jruby]
```
:::
---
#### Config yml ([liquid template](https://shopify.github.io/liquid/))
:::info
```
{% assign log_suffix = '' %}
{% capture log_type_template %}type/{{ env.LOG_TYPE }}{% endcapture %}
{% capture log_type_filter_template %}type/{{ env.LOG_TYPE }}_filter{% endcapture %}
{% capture default_bucket %}{{ env.PROFILE }}-app-{{ env.APP_NAME}}-logs{% endcapture %}
{% capture default_dataset %}{{ env.PROFILE | append: "_" | append: env.APP_NAME | append: "_bulkload" }}{% endcapture %}
{% capture default_table %}{{ env.LOG_TYPE | append: env.SUFFIX_HOUR | append: env.SUFFIX_DATE }}{% endcapture %}
exec: {}
in:
  type: s3
  {% include log_type_template %}
  {% capture s3_path_regex %}{{ log_suffix | append: env.SUFFIX_TIME | append: ".*\.gz$" }}{% endcapture %}
  bucket: {{ custom_bucket | default: default_bucket }}
  path_prefix: {{ custom_path_prefix | default: env.DATE_PATH }}/
  path_match_pattern: {{ custom_match_pattern | default: s3_path_regex }}
  endpoint: s3-ap-northeast-1.amazonaws.com
  {% if env.AWS_ACCESS_KEY_ID %}
  auth_method: env
  {% else %}
  auth_method: instance
  {% endif %}
{% include log_type_filter_template %}
out:
  type: bigquery
  auth_method: json_key
  json_keyfile: {content: '{{ env.BQ_JSON_KEYFILE_CONTENT }}'}
  auto_create_table: true
  auto_create_dataset: true
  dataset: {{ env.BQ_DATASET | default: default_dataset }}
  table: {{ env.BQ_TABLE | default: default_table }}
  mode: {{ env.OUT_MODE | default: "replace" }}
  path_prefix: /tmp/{{ default_dataset }}
  formatter: {type: csv, header_line: false}
  allow_quoted_newlines: true
  file_ext: .csv.gz
  compression: GZIP
  abort_on_error: false
  max_bad_records: 100
```
:::
---
#### Deploy
:::info
```
$ dokku apps:create embulk
$ git init
$ git remote add dokku dokku@192.168.33.10:embulk
$ git push dokku master
```
:::
---
#### Run command
:::info
```
$ dokku run embulk ${script}
### scripts
/usr/bin/java -jar /opt/embulk.jar run -b . embulk/template.yml.liquid
```
:::
---
## docker-options
- `-p host_port:container_port`
:::info
```
$ dokku docker-options:add fluentd1 run -p 24224:24224
$ dokku docker-options:add fluentd2 run -p 24225:24224
```
:::
---
## Ansible dokku
https://github.com/dokku/ansible-dokku
---
## DNS
http://xip.io/
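xip.io is a wildcard DNS service: any hostname ending in `<ip>.xip.io` resolves to the embedded IP, so per-app domains need no DNS setup. The naming convention can be illustrated with plain string handling (no DNS lookup):

```shell
# Any name under <ip>.xip.io resolves to the embedded IP address.
host="jenkins.dokku.192.168.33.10.xip.io"
ip=$(echo "$host" | sed -E 's/.*[^0-9]([0-9]+(\.[0-9]+){3})\.xip\.io$/\1/')
echo "$ip"  # 192.168.33.10
```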
---
## Finally
***It's great to use `Heroku` if you have the budget!***
:money_with_wings:
---
# *Thank you!*