Apache Flink is a framework and distributed processing engine for stateful computations over unbounded and bounded data streams. Flink has been designed to run in all common cluster environments, perform computations at in-memory speed and at any scale.
Learn more about Flink at https://flink.apache.org/
The Apache Flink Python API depends on Py4J (currently version 0.10.9.7), CloudPickle (currently version 2.2.0), python-dateutil (currently version >=2.8.0,<3) and Apache Beam (currently version >=2.43.0,<2.49.0).
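The range constraints above (e.g. python-dateutil's >=2.8.0,<3) mean the lower bound is inclusive and the upper bound is exclusive. Real installers resolve these with the packaging library; the sketch below is only an illustrative simplification using numeric tuples:

```python
# Illustrative sketch of checking a version against a range like ">=2.8.0,<3".
# Real tools use the 'packaging' library; this tuple comparison is a
# simplification that ignores pre-release and local version segments.
def parse(v):
    return tuple(int(p) for p in v.split("."))

def in_range(v, low, high):
    # low is inclusive (>=), high is exclusive (<)
    return parse(low) <= parse(v) < parse(high)

print(in_range("2.8.2", "2.8.0", "3"))  # True: satisfies >=2.8.0,<3
print(in_range("3.0.0", "2.8.0", "3"))  # False: excluded by the upper bound
```

Note that (3, 0, 0) does not satisfy "< 3" because Python compares tuples element-wise and treats the shorter, equal-prefix tuple (3,) as smaller.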
Protobuf Code Generation
Protocol buffers are used in the file flink_fn_execution_pb2.pyi, which is generated from flink-fn-execution.proto. Whenever flink-fn-execution.proto is updated, please re-generate flink_fn_execution_pb2.pyi by executing:
python pyflink/gen_protos.py
PyFlink depends on the following libraries to execute the above script:
1. grpcio-tools (>=1.29.0,<=1.50.0)
2. setuptools (>=37.0.0)
3. pip (>=20.3)
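Before running the script, one quick sanity check is to confirm the libraries it needs are installed. The helper below is an illustrative standard-library sketch, not part of PyFlink:

```python
from importlib.metadata import version, PackageNotFoundError

# Illustrative helper: report the installed version of each dependency,
# or None when a package is missing from the current environment.
def installed_versions(packages):
    result = {}
    for name in packages:
        try:
            result[name] = version(name)
        except PackageNotFoundError:
            result[name] = None
    return result

print(installed_versions(["grpcio-tools", "setuptools", "pip"]))
```

Any entry reported as None must be installed (e.g. via pip) before the code-generation script will run.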
Running Test Cases
Currently, we use conda and tox to verify the compatibility of the Flink Python API across multiple Python versions, and we integrate useful plugins with tox, such as flake8.
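For readers unfamiliar with tox: it reads a tox.ini file that declares one environment per Python version and the commands to run in each. The fragment below is a hypothetical minimal sketch of that pattern, not flink-python's actual configuration:

```ini
; hypothetical minimal tox.ini sketch (flink-python's real file differs)
[tox]
envlist = py38,py39

[testenv]
deps =
    pytest
    flake8
commands =
    flake8 pyflink
    pytest
```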
We can enter the directory where this README.md file is located and run the test cases by executing:
./dev/lint-python.sh
To use your system conda environment, you can set
export FLINK_CONDA_HOME=$(dirname $(dirname $CONDA_EXE))
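The nested dirname calls strip the executable name and then the bin directory from $CONDA_EXE (which conda sets itself), leaving the installation root. For a hypothetical install under /opt/miniconda3 this resolves as follows:

```shell
# hypothetical conda location; in practice $CONDA_EXE is already set by conda
CONDA_EXE=/opt/miniconda3/bin/conda
FLINK_CONDA_HOME=$(dirname "$(dirname "$CONDA_EXE")")
echo "$FLINK_CONDA_HOME"   # prints /opt/miniconda3
```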
Create a virtual environment:
conda create -n pyflink_38 python=3.8
Then you can activate your environment and run tests, for example:
conda activate pyflink_38
pip install -r ./dev/dev-requirements.txt