This page explains how to configure and run the sample.
If you want to run the sample on Windows, macOS, or Linux, you need the following tools.
- Azure Functions Core Tools (v3 or above)
- Python 3.8
- Azure CLI
However, if you can use a DevContainer, you don't need to prepare the development environment yourself. The prerequisites for the DevContainer are:
- Docker for Windows or Docker for Mac
- Visual Studio Code
- Visual Studio Code - Remote Development extension
The DevContainer sets up all of the prerequisites, including the Azure CLI, along with a local Kafka cluster.
Go to the samples/python directory, then open Visual Studio Code.

```
$ cd samples/python
$ code .
```
Visual Studio Code might automatically ask you to start a container. If not, you can click the green icon (><) in the bottom corner of the window, and you will see the following dropdown.
Select Remote-Containers: Reopen in Container.
It starts the DevContainer; wait a couple of minutes, and you will find a Python development environment, with a local Kafka cluster already up, inside Visual Studio Code.
This sample contains two functions. In the Kafka Cluster column below, local means the function uses the Kafka cluster that is started with the DevContainer.
| Name | Description | Kafka Cluster | Enabled |
|---|---|---|---|
| KafkaTrigger | Simple Kafka trigger sample | local | yes |
| KafkaTriggerMany | Kafka batch processing sample with Confluent Cloud | Confluent Cloud | no |
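As a sketch of how the local trigger sample is wired up, a function.json for the Kafka trigger binding might look like the following. The topic and broker address match the local cluster used later in this document; the script file name, binding name, and consumer group are illustrative assumptions, not taken from the sample.

```json
{
  "scriptFile": "main.py",
  "bindings": [
    {
      "type": "kafkaTrigger",
      "direction": "in",
      "name": "kevent",
      "topic": "users",
      "brokerList": "broker:29092",
      "consumerGroup": "$Default"
    }
  ]
}
```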
Create a virtual environment and install the dependencies.

```
$ python -m venv .venv
$ source .venv/bin/activate
$ pip install -r requirements.txt
```
If you want to use the KafkaTriggerMany sample, rename KafkaTriggerMany/function.json_ to KafkaTriggerMany/function.json; the Azure Functions runtime will then detect the function.
Then copy local.settings.json.example to local.settings.json and configure your Confluent Cloud environment.
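As a sketch, local.settings.json might end up looking like the following, using the setting names described later in this document. The broker address and credentials are placeholders.

```json
{
  "IsEncrypted": false,
  "Values": {
    "FUNCTIONS_WORKER_RUNTIME": "python",
    "BrokerList": "changeme.eastus.azure.confluent.cloud:9092",
    "ConfluentCloudUsername": "<confluent-cloud-username>",
    "ConfluentCloudPassword": "<confluent-cloud-password>"
  }
}
```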
If you want to run the sample on Windows with Confluent Cloud and you are not using the DevContainer, uncomment the following line. It sets the location of the CA certificate: .NET Core, which the Azure Functions host runs on, cannot access the Windows registry, which means it cannot find the CA certificate for Confluent Cloud.

KafkaTriggerMany/function.json

```
"sslCaLocation":"confluent_cloud_cacert.pem",
```
To download confluent_cloud_cacert.pem, refer to Connecting to Confluent Cloud in Azure.
The following command installs the Kafka extension. It reads extensions.csproj and finds the Kafka extension NuGet package.

```
$ func extensions install
```
Check whether there are dll packages under target/azure-functions/kafka-function-(some number)/bin. If the installation succeeded, you will find Microsoft.Azure.WebJobs.Extensions.Kafka.dll there.
Before running the Kafka extension, you need to set LD_LIBRARY_PATH to /workspace/bin/runtimes/linux-x64/native. For the DevContainer, this configuration already resides in devcontainer.json, so you don't need to set it yourself.
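If you run outside the DevContainer, you can set it manually; a sketch assuming the same directory layout as the DevContainer:

```shell
# Set once per shell session before starting the Functions host.
export LD_LIBRARY_PATH=/workspace/bin/runtimes/linux-x64/native
```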
```
$ func start
```
Deploy the app to a Premium Function App. You can follow either of these quickstarts.
- Quickstart: Create a function in Azure using Visual Studio Code
- Quickstart: Create a function in Azure that responds to HTTP requests
Go to the Azure Portal, select the Function App, then go to Configuration > Application settings. You need to configure these application settings: BrokerList, ConfluentCloudUsername, and ConfluentCloudPassword are required for the sample. LD_LIBRARY_PATH is required for Linux-based Function Apps; it points to the shared object (.so) library included in the Kafka extension.
| Name | Description | NOTE |
|---|---|---|
| BrokerList | Kafka Broker List | e.g. changeme.eastus.azure.confluent.cloud:9092 |
| ConfluentCloudUsername | Username of Confluent Cloud | - |
| ConfluentCloudPassword | Password of Confluent Cloud | - |
| LD_LIBRARY_PATH | /home/site/wwwroot/bin/runtimes/linux-x64/native | Linux only |
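Instead of the Portal, the same settings can be applied with the Azure CLI; a sketch where the Function App name, resource group, and credential values are placeholders:

```
$ az functionapp config appsettings set \
    --name <function-app-name> \
    --resource-group <resource-group> \
    --settings "BrokerList=changeme.eastus.azure.confluent.cloud:9092" \
               "ConfluentCloudUsername=<username>" \
               "ConfluentCloudPassword=<password>" \
               "LD_LIBRARY_PATH=/home/site/wwwroot/bin/runtimes/linux-x64/native"
```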
Send Kafka events from a producer. For Confluent Cloud, you can use the ccloud command.

```
$ ccloud login
$ ccloud kafka topic produce message
```
For more details, see ccloud.
If you want to send an event to the local Kafka cluster, you can use kafkacat instead.

```
$ apt-get update && apt-get install kafkacat
$ kafkacat -b broker:29092 -t users -P
```
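Whichever producer you use, the function receives each message as a raw body. A minimal sketch of decoding a JSON payload in plain Python (the payload shape here is assumed, not taken from the sample):

```python
import json

# A Kafka message body arrives as bytes; decode and parse it.
body = b'{"name": "alice"}'
event = json.loads(body.decode("utf-8"))
print(event["name"])  # alice
```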