pyflink.table.table_environment.TableEnvironment.set_python_requirements

TableEnvironment.set_python_requirements(requirements_file_path: str, requirements_cache_dir: Optional[str] = None)

Specifies a requirements.txt file that defines the third-party dependencies. These dependencies will be installed in a temporary directory and added to the PYTHONPATH of the Python UDF worker.
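
When the cluster nodes can access PyPI directly, only the requirements file needs to be passed and the cache directory can be omitted. A minimal sketch of this case (the numpy dependency and the UDF are illustrative assumptions, not part of this API):

# python code
>>> from pyflink.table import DataTypes
>>> from pyflink.table.udf import udf
>>> table_env.set_python_requirements("requirements.txt")
>>> # packages listed in requirements.txt can then be imported inside Python UDFs
>>> @udf(result_type=DataTypes.FLOAT())
... def absolute(x):
...     import numpy as np
...     return float(np.abs(x))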

For dependencies that cannot be accessed in the cluster, a directory containing their installation packages can be specified via the parameter requirements_cache_dir. It will be uploaded to the cluster to support offline installation.

Example:

# commands executed in shell
$ echo numpy==1.16.5 > requirements.txt
$ pip download -d cached_dir -r requirements.txt --no-binary :all:

# python code
>>> table_env.set_python_requirements("requirements.txt", "cached_dir")

Note

Please make sure that the installation packages match the platform of the cluster and the Python version used. These packages will be installed using pip, so also make sure that pip (version >= 20.3) and SetupTools (version >= 37.0.0) are available.
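
If the cluster platform or Python version differs from the machine that prepares the cache directory, pre-built wheels matching the cluster can be downloaded instead of source packages. A hedged sketch, assuming a Linux x86_64 cluster running CPython 3.8 (adjust the tags to the actual cluster):

# commands executed in shell
$ pip download -d cached_dir -r requirements.txt --only-binary :all: \
      --platform manylinux2014_x86_64 --python-version 38 --implementation cp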

Parameters
  • requirements_file_path – The path of the requirements.txt file.

  • requirements_cache_dir – The path of the local directory which contains the installation packages.

New in version 1.10.0.
