# Redis pipeline study
---
## The case under discussion
```ruby=
$noaof = $redis_noaof
user_agent = 'SRC'
redis_key = "xxx"

result = $noaof.pipelined do
  exist = $noaof.exists? redis_key
  $noaof.del(redis_key) if exist && user_agent == 'SRC'
end
```
---
## Print it out and see

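A minimal sketch of what printing shows, assuming redis-rb 4.x and the same `$noaof`, `redis_key`, and `user_agent` as above:

```ruby=
result = $noaof.pipelined do
  exist = $noaof.exists? redis_key
  p exist
  # => roughly #<Redis::Future ...> -- inside the pipelined block every command
  #    returns a Future instead of a real reply, so `exist && ...` is always truthy.
  $noaof.del(redis_key) if exist && user_agent == 'SRC'
end
p result
# => e.g. [true, 1] when the key exists: the replies of both queued commands.
```

The `exists?` check therefore guards nothing here; the `del` gets queued on every run.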
---
## Result: one command does the job of two
```ruby=
user_agent = 'SRC'
redis_key = "xxx"

# One DEL replaces the EXISTS + DEL pair:
result = $noaof.del(redis_key) if user_agent == 'SRC'

$noaof.set '123', '1'
$noaof.del '1234'
#=> 0  (key did not exist)
$noaof.del '123'
#=> 1  (key existed and was deleted)
```
In fact, the return value of `del` already encodes whether the key existed.
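So when the caller still needs an explicit existence flag, it can be derived from that return value; a small sketch reusing the names above:

```ruby=
deleted = $noaof.del(redis_key) # DEL returns the number of keys it removed
key_existed = deleted > 0       # 1 means the key was there, 0 means it was not
```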
---
## Benchmark
```ruby
# Warm-up: the same EXISTS + DEL pattern, once, inside a pipeline.
$redis_noaof.set '123', '1'
cc = true
$redis_noaof.pipelined do |pipe|
  dd = pipe.exists? '123' # a Future inside the pipeline, so always truthy
  pipe.del('123') if cc && dd
end

def bench(descr)
  start = Time.now
  yield
  puts "#{descr} #{Time.now - start} seconds"
end

def without_pipelining
  cc = true
  $redis_noaof.set '123', '1'
  10_000.times do
    dd = $redis_noaof.exists? '123'
    $redis_noaof.del('123') if cc && dd
  end
end

def with_pipelining
  cc = true
  $redis_noaof.set '123', '1'
  $redis_noaof.pipelined do |pipe|
    10_000.times do
      dd = pipe.exists? '123'
      pipe.del('123') if cc && dd
    end
  end
end

bench('without pipelining') { without_pipelining }
bench('with pipelining')    { with_pipelining }
```
---
## 14x faster

The numbers above are measured without network latency (client and server on the same host).
If every Redis round trip also cost 20 ms of network latency, the gap would be far larger; a rough estimate is sketched below.
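
A back-of-the-envelope estimate, assuming a 20 ms round trip and the 10,000-iteration loop from the benchmark above:

```ruby
rtt        = 0.020  # assumed 20 ms network round trip per request
iterations = 10_000
commands   = 2      # EXISTS + DEL per iteration

latency_without_pipelining = iterations * commands * rtt
# => 400.0 seconds spent on round trips alone

latency_with_pipelining = rtt
# => ~0.02 seconds: the whole pipeline is flushed in (roughly) one batch
```

Roughly 400 seconds of waiting versus a fraction of a second, before the server even does any work.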
---
## Bonus
---
### Throughput per data size (keep transferred payloads under 1500 bytes, about one Ethernet MTU)

---
### Requests per second

---
## References
[Redis benchmark](https://redis.io/docs/reference/optimization/benchmarks/)
{"metaMigratedAt":"2023-06-17T01:26:24.998Z","metaMigratedFrom":"Content","title":"Redis pipeline 研究","breaks":true,"contributors":"[{\"id\":\"ddfe0d34-cd14-4d06-a64e-9ed56d0c1c64\",\"add\":1733,\"del\":57}]"}