
Commit cb71c4c

committed
init
1 parent 3bfd4f3 commit cb71c4c

21 files changed

Lines changed: 987 additions & 1 deletion
Lines changed: 37 additions & 0 deletions
@@ -0,0 +1,37 @@
```yaml
# This workflow will install Python dependencies, run tests and lint with a variety of Python versions
# For more information see: https://help.github.com/actions/language-and-framework-guides/using-python-with-github-actions

name: Python package

on:
  push:
    branches: [master, dev]
  pull_request:
    branches: [master]

jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: [3.8, 3.9]

    steps:
      - uses: actions/checkout@v2
      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v2
        with:
          python-version: ${{ matrix.python-version }}
      - name: Install dev dependencies
        run: |
          python -m pip install --upgrade pip
          pip install mypy pycodestyle coverage lxml
      - name: Install dependencies
        run: |
          if [ -f requirements.txt ]; then pip install -r requirements.txt; fi
      - name: Lint with pycodestyle
        run: |
          pycodestyle --max-line-length=140 --ignore=E501 --first --statistics kafkaproxy
      - name: Type Hint Check
        run: |
          mypy --ignore-missing-imports --show-column-numbers --follow-imports=silent --check-untyped-defs --disallow-untyped-defs --no-implicit-optional --warn-unused-ignores kafkaproxy
```
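The mypy invocation above is strict: `--disallow-untyped-defs` rejects any function without full annotations, and `--no-implicit-optional` rejects a default of `None` on a parameter not annotated as `Optional`. A minimal sketch of code that would pass these flags (function and parameter names here are illustrative, not from the project):

```python
from typing import Optional


def publish_batch(topic: str, messages: list, timeout: Optional[float] = None) -> int:
    """Return the number of messages queued.

    `timeout` must be spelled `Optional[float]`: under --no-implicit-optional,
    a bare `timeout: float = None` is a type error.
    """
    count = 0
    for _ in messages:
        count += 1
    return count


print(publish_batch("topic1", ["a", "b", "c"]))
```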
Lines changed: 38 additions & 0 deletions
@@ -0,0 +1,38 @@
```yaml
# This workflow will upload a Python Package using Twine when a release is created
# For more information see: https://help.github.com/en/actions/language-and-framework-guides/using-python-with-github-actions#publishing-to-package-registries

name: Upload Python Package

on:
  release:
    types: [created]

jobs:
  deploy:

    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v2
      - name: Set up Python
        uses: actions/setup-python@v2
        with:
          python-version: '3.x'
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install setuptools wheel twine
      - name: Build and publish
        run: |
          python setup.py sdist bdist_wheel bdist_egg
      - name: 'Upload dist'
        uses: 'actions/upload-artifact@v2'
        with:
          name: packages
          path: dist/*
      - name: Publish
        env:
          TWINE_USERNAME: ${{ secrets.PYPI_USERNAME }}
          TWINE_PASSWORD: ${{ secrets.PYPI_PASSWORD }}
        run: |
          twine upload dist/*
```

.gitignore

Lines changed: 2 additions & 0 deletions
@@ -127,3 +127,5 @@ dmypy.json
```diff
 # Pyre type checker
 .pyre/
+zk
+kafka
```

CHANGELOG.md

Lines changed: 3 additions & 0 deletions
@@ -0,0 +1,3 @@
```markdown
# v0.0.1

Project created
```

MANIFEST.in

Lines changed: 5 additions & 0 deletions
@@ -0,0 +1,5 @@
```
include LICENSE
include README.md
include CHANGELOG.md
include requirements.txt
recursive-include kafkaproxy *.pyx *.pxd *.pxi *.py *.c *.h *.temp *.jinja
```

README.md

Lines changed: 140 additions & 1 deletion
@@ -1,2 +1,141 @@
# kafkaproxy

A proxy tool for kafka producers and consumers.

Proxy objects defer initialization. We can write logic directly against a global proxy variable wherever it is needed, which avoids passing the wrapped object back and forth between functions.
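The deferred-initialization idea can be shown with a stripped-down sketch (this is illustrative, not the actual kafkaproxy implementation; `LazyProxy`, `initialize`, and `client` are made-up names): the proxy is created empty at import time as a global, and only forwards attribute access once an explicit initialize call binds the real object.

```python
from typing import Any


class LazyProxy:
    """Illustrative deferred-initialization proxy: created empty, bound later."""

    def __init__(self) -> None:
        self._instance: Any = None

    def initialize(self, instance: Any) -> None:
        # Bind the real object; before this, any attribute access fails loudly.
        self._instance = instance

    def __getattr__(self, name: str) -> Any:
        # Only reached for attributes not defined on the proxy itself.
        if self._instance is None:
            raise RuntimeError("proxy used before initialization")
        return getattr(self._instance, name)


# Declared globally; functions can close over it before it is bound.
client = LazyProxy()
client.initialize({"bootstrap": "localhost:9094"})
print(client.get("bootstrap"))
```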
## Features

+ Supports proxying the producer and consumer objects of `kafka-python`, `aiokafka`, and `confluent-kafka`.
+ Provides a unified, generic producer and consumer interface wrapper.

## Installation

+ Install only this project, without the dependencies of any proxied package: `pip install kafkaproxy`
+ Install together with `kafka-python` as the proxied package: `pip install kafkaproxy[kafka]`
+ Install together with `aiokafka` as the proxied package: `pip install kafkaproxy[aio]`
+ Install together with `confluent-kafka` as the proxied package: `pip install kafkaproxy[confluent]`
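The extras above suggest an `extras_require` mapping in `setup.py` pointing each extra at the corresponding client library. The actual `setup.py` is not shown in this commit, so the sketch below is an assumption about its shape, not the real file:

```python
# Hypothetical sketch of how the install extras could be declared;
# this mapping would be passed as setuptools.setup(..., extras_require=EXTRAS_REQUIRE).
EXTRAS_REQUIRE = {
    "kafka": ["kafka-python"],         # pip install kafkaproxy[kafka]
    "aio": ["aiokafka"],               # pip install kafkaproxy[aio]
    "confluent": ["confluent-kafka"],  # pip install kafkaproxy[confluent]
}

print(sorted(EXTRAS_REQUIRE))
```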
## Usage

This project can proxy the corresponding objects from the three kafka modules; which one is specified with a value of the `KafkaType` enum when calling `initialize_from_addresses`.
Besides proxying the object as-is, the proxy also provides unified, generic producer and consumer interface wrappers.
Since the corresponding methods are bound dynamically, you can use `typing.cast` to cast the proxy object to the matching protocol object if you need their type hints:

+ synchronous producers use `ProducerProtocol`
+ asynchronous producers use `AioProducerProtocol`
+ synchronous consumers use `ConsumerProtocol`
+ asynchronous consumers use `AioConsumerProtocol`

> Proxying a `kafka-python` or `confluent-kafka` producer

```python
from kafkaproxy import ProducerProxy, KafkaType, ProducerProtocol
from typing import cast
import time

kafkap = ProducerProxy()


def run() -> None:
    p = cast(ProducerProtocol, kafkap)
    with p.mount() as cli:
        for i in range(10):
            cli.publish("topic1", f"send {i}")
            time.sleep(0.1)


# kafkap.initialize_from_addresses("localhost:9094", kafka_type=KafkaType.ConfluentKafka, acks="all")
kafkap.initialize_from_addresses("localhost:9094", kafka_type=KafkaType.Kafka)
try:
    print("start publishing")
    run()
finally:
    print("stopped")
```

> Proxying a `kafka-python` or `confluent-kafka` consumer

```python
from kafkaproxy import ConsumerProxy, KafkaType, ConsumerProtocol
from typing import cast

kafkac = ConsumerProxy()


def run() -> None:
    c = cast(ConsumerProtocol, kafkac)
    with c.watch() as g:
        for record in g:
            print(record.value)


# kafkac.initialize_from_addresses("localhost:9094", "topic1", group_id="test2", kafka_type=KafkaType.Kafka)
kafkac.initialize_from_addresses("localhost:9094", "topic1", group_id="test2", kafka_type=KafkaType.ConfluentKafka)
try:
    print("start watching")
    run()
finally:
    print("stopped")
```

> Proxying an `aiokafka` producer

```python
import asyncio
from kafkaproxy import ProducerProxy, KafkaType, AioProducerProtocol
from typing import cast

kafkap = ProducerProxy()


async def run() -> None:
    p = cast(AioProducerProtocol, kafkap)
    async with p.mount() as cli:
        for i in range(10):
            await cli.publish("topic1", f"send {i}")
            await asyncio.sleep(0.1)


async def main() -> None:
    kafkap.initialize_from_addresses("localhost:9094", kafka_type=KafkaType.AioKafka, acks="all")
    await run()


try:
    print("start publishing")
    asyncio.run(main())
finally:
    print("stopped")
```

> Proxying an `aiokafka` consumer

```python
import asyncio
from kafkaproxy import ConsumerProxy, KafkaAutoOffsetReset, KafkaType, AioConsumerProtocol
from typing import cast

kafkac = ConsumerProxy()


async def run() -> None:
    c = cast(AioConsumerProtocol, kafkac)
    async with c.watch() as g:
        async for record in g:
            print(record.value)


async def main() -> None:
    kafkac.initialize_from_addresses("localhost:9094", "topic1", group_id="test2", kafka_type=KafkaType.AioKafka, auto_offset_reset=KafkaAutoOffsetReset.earliest)
    await run()


try:
    print("start watching")
    asyncio.run(main())
finally:
    print("stopped")
```

docker-compose.yml

Lines changed: 45 additions & 0 deletions
@@ -0,0 +1,45 @@
```yaml
version: "2.4"
services:
  # Kafka-related tooling
  # zookeeper:
  #   image: docker.io/bitnami/zookeeper:3.8
  #   ports:
  #     - "2181:2181"
  #   volumes:
  #     - "zookeeper_data:/bitnami"
  #   environment:
  #     - ALLOW_ANONYMOUS_LOGIN=yes
  # kafka:
  #   image: docker.io/bitnami/kafka:3.3
  #   ports:
  #     - "9092:9092"
  #   volumes:
  #     - "kafka_data:/bitnami"
  #   environment:
  #     - KAFKA_CFG_ZOOKEEPER_CONNECT=zookeeper:2181
  #     - ALLOW_PLAINTEXT_LISTENER=yes
  #   depends_on:
  #     - zookeeper
  kafka:
    image: 'bitnami/kafka:latest'
    ports:
      - '9094:9094'
    environment:
      - KAFKA_ENABLE_KRAFT=yes
      - KAFKA_CFG_PROCESS_ROLES=broker,controller
      - KAFKA_CFG_CONTROLLER_LISTENER_NAMES=CONTROLLER
      - KAFKA_CFG_LISTENERS=CONTROLLER://:9093,CLIENT://:9092,EXTERNAL://:9094
      - KAFKA_CFG_LISTENER_SECURITY_PROTOCOL_MAP=CONTROLLER:PLAINTEXT,CLIENT:PLAINTEXT,EXTERNAL:PLAINTEXT
      - KAFKA_CFG_ADVERTISED_LISTENERS=CLIENT://kafka:9092,EXTERNAL://localhost:9094
      - KAFKA_BROKER_ID=1
      - KAFKA_CFG_CONTROLLER_QUORUM_VOTERS=1@127.0.0.1:9093
      - KAFKA_CFG_INTER_BROKER_LISTENER_NAME=CLIENT
      - ALLOW_PLAINTEXT_LISTENER=yes
      - KAFKA_CFG_AUTO_CREATE_TOPICS_ENABLE=true
      - KAFKA_CFG_LOG_RETENTION_HOURS=12
      # - KAFKA_AUTO_CREATE_TOPICS_ENABLE=yes
      # - KAFKA_LOG_RETENTION_HOURS=12
      # - KAFKA_CFG_LISTENER_SECURITY_PROTOCOL_MAP=CLIENT:PLAINTEXT,EXTERNAL:PLAINTEXT
      # - KAFKA_CFG_LISTENERS=CLIENT://:9092,EXTERNAL://:9093
      # - KAFKA_CFG_ADVERTISED_LISTENERS=CLIENT://kafka:9092,EXTERNAL://localhost:9093
      # - KAFKA_CFG_INTER_BROKER_LISTENER_NAME=CLIENT
```

kafkaproxy/__init__.py

Lines changed: 27 additions & 0 deletions
@@ -0,0 +1,27 @@
1+
"""kafka生产者和消费者的代理工具.
2+
3+
代理对象用于推迟初始化.我们可以在需要的地方用代理对象的全局变量直接编写逻辑,避免被代理的对象来回在函数间传递.
4+
本项目支持代理3种kafka模块中的对应模块,使用枚举`KafkaType`中的取值在调用方法`initialize_from_addresses`初始化时指定.
5+
代理对象除了原样代理对象外还提供了生产者和消费者的统一通用接口包装.
6+
由于对应的方法是动态绑定的,因此如果需要他们的typehints可以用`typing.cast`将代理对象转化为对应的协议对象
7+
8+
+ 同步接口生产者使用`ProducerProtocol`
9+
+ 异步接口生产者使用`AioProducerProtocol`
10+
+ 同步接消费产者使用`ConsumerProtocol`
11+
+ 异步接消费产者使用`AioConsumerProtocol`
12+
"""
13+
14+
from .consumer import ConsumerProxy
15+
from .producer import ProducerProxy
16+
from .models import (
17+
ConsumerRecord,
18+
KafkaType,
19+
KafkaAutoOffsetReset
20+
)
21+
22+
from .protocols import (
23+
ProducerProtocol,
24+
AioProducerProtocol,
25+
ConsumerProtocol,
26+
AioConsumerProtocol
27+
)
