Flink projects can be built with different build tools. To get started quickly, Flink provides project templates for the following build tools:
These templates help you to set up the project structure and to create the initial build files.
You can scaffold a new project via either of the following two methods:
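A sketch of the two common scaffolding methods, assuming the giter8 template and quickstart script published in the Flink docs (verify the template name and URL against your Flink version):

```shell
# Option 1: scaffold from the giter8 template via sbt
sbt new tillrohrmann/flink-project.g8

# Option 2: run the quickstart shell script
bash <(curl https://flink.apache.org/q/sbt-quickstart.sh)
```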
To build your project, simply issue the `sbt clean assembly` command.
This will create the fat JAR `your-project-name-assembly-0.1-SNAPSHOT.jar` in the directory `target/scala_your-major-scala-version/`.
To run your project, issue the `sbt run` command.
By default, this will run your job in the same JVM as sbt is running.
To run your job in a distinct JVM, add the following line to `build.sbt`:
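A minimal sketch of that line, assuming the template uses sbt's standard `fork` option:

```scala
// Run the job in a separate, forked JVM instead of inside sbt's own JVM
fork in run := true
```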
We recommend using IntelliJ for your Flink job development.
To get started, import your newly created project into IntelliJ via File -> New -> Project from Existing Sources... and then choose your project's directory.
IntelliJ will then automatically detect the `build.sbt` file and set everything up.
To run your Flink job, it is recommended to choose the `mainRunner` module as the classpath of your Run/Debug Configuration.
This ensures that all dependencies which are set to provided will be available upon execution.
You can configure the Run/Debug Configuration via Run -> Edit Configurations... and then choose `mainRunner` from the Use classpath of module dropdown.
To import the newly created project into Eclipse, you first have to create Eclipse project files for it.
These project files can be created via the sbteclipse plugin.
Add the following line to your `PROJECT_DIR/project/plugins.sbt` file:
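A sketch of that plugin line; the version number is illustrative, so use the latest sbteclipse release:

```scala
// Provides the `eclipse` task that generates Eclipse project files
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "5.2.4")
```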
In sbt, use the following command to create the Eclipse project files:
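With the plugin on the classpath, the project files are generated by the plugin's `eclipse` task, e.g.:

```shell
sbt eclipse
```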
Now you can import the project into Eclipse via File -> Import... -> Existing Projects into Workspace and then select the project directory.
The only requirements are working Maven 3.0.4 (or higher) and Java 8.x installations.
Use one of the following commands to create a project:
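For example, using the Maven archetype or the quickstart script from the Flink website (replace `<flink-version>` with the Flink release you target):

```shell
# Option 1: Maven archetype
mvn archetype:generate \
  -DarchetypeGroupId=org.apache.flink \
  -DarchetypeArtifactId=flink-quickstart-java \
  -DarchetypeVersion=<flink-version>

# Option 2: quickstart script
curl https://flink.apache.org/q/quickstart.sh | bash -s <flink-version>
```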
There will be a new directory in your working directory. If you've used the curl approach, the directory is called `quickstart`. Otherwise, it has the name of your `artifactId`.
The sample project is a Maven project containing two classes: StreamingJob and BatchJob are the basic skeleton programs for a DataStream and a DataSet program, respectively. The main method is the entry point of the program, both for in-IDE testing/execution and for proper deployments.
We recommend you import this project into your IDE.
IntelliJ IDEA supports Maven out of the box and offers a plugin for Scala development. From our experience, IntelliJ provides the best experience for developing Flink applications.
For Eclipse, you need the following plugins, which you can install from the provided Eclipse Update Sites:
If you want to build/package your project, go to your project directory and run the `mvn clean package` command.
You will find a JAR file that contains your application, plus any connectors and libraries you may have added as dependencies, at `target/<artifact-id>-<version>.jar`.
Note: If you use a different class than StreamingJob as the application's main class / entry point, we recommend you change the `mainClass` setting in the `pom.xml` file accordingly. That way, Flink can run the application from the JAR file without additionally specifying the main class.
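In the quickstart `pom.xml`, the main class is typically set on the maven-shade-plugin's `ManifestResourceTransformer`; a sketch with an illustrative package and class name:

```xml
<!-- Inside the maven-shade-plugin configuration; the class name below is illustrative -->
<transformers>
  <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
    <mainClass>org.myorg.quickstart.StreamingJob</mainClass>
  </transformer>
</transformers>
```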
Write your application!
If you are writing a streaming application and are looking for inspiration what to write, take a look at the Stream Processing Application Tutorial.
If you are writing a batch processing application and are looking for inspiration what to write, take a look at the Batch Application Examples.
For a complete overview of the APIs, have a look at the DataStream API and DataSet API sections.
Here you can find out how to run an application outside the IDE on a local cluster.
If you have any trouble, ask on our Mailing List. We are happy to provide help.