Creating integration flows with the Reedelk Data Integration Platform

3.9.2020 | 8 minutes of reading time

Integrating data from systems of record or legacy systems is a typical element of software development projects that do not start on a greenfield, in other words, projects that modernize existing software. The question that usually arises is how to transfer the affected data into the “new” world without much effort.

ESB, ETL and ELT are not dead

If you look at the use case schematically, it is basically about ETL (Extract-Transform-Load) or ELT (Extract-Load-Transform) processes. Nowadays, Kafka is quickly reached for as a super weapon or panacea, and this blog post will show that it is not always the best solution. The Enterprise Service Bus, on the other hand, has a long tradition in software development.

And it is exactly this functionality that the Reedelk Data Integration Platform aims to provide in a simple way. The product is relatively new on the market; it has been available as a community edition since the end of June 2020.

IntelliJ as a development environment provides the foundation and is extended with a plugin, the Reedelk Flow Designer (https://www.reedelk.com/documentation/intellijplugin). MuleSoft and Red Hat offer a similar model; both rely on Eclipse and also deliver a complete development environment. The Flow Designer for the community edition of the platform is currently closed source, but it is available free of charge. The created runtimes can be deployed in the cloud in various ways; detailed information can be found in the documentation (https://www.reedelk.com/documentation/runtime). Every created runtime project ships with the option to create a Docker container, so the different deployment types can be produced via build pipeline mechanisms. Since version 1.0.1, an OpenAPI Import Tool has also been integrated, so the platform now supports the API-first approach.

An integration project

Now we want to take a look at the platform with the help of a demo project. The idea is a concrete implementation of the use case mentioned at the beginning: our fictitious customer has an old legacy application whose data should now be made available to newer applications with the help of APIs. The data model is described in the following section.

The setting

The demo is based on the PostgreSQL example DB “Airlines”. The legacy system is an old booking system. The main entity is a booking (bookings). A booking can include several passengers, and a separate ticket (tickets) is issued for each passenger. A ticket has a unique number and contains information about the passenger. As such, the passenger is not a separate entity. Both the name of the passenger and the number of his or her ID card may change over time, making it impossible to uniquely identify all tickets for a particular person; for the sake of simplicity, we can assume that all passengers are unique.

The ticket includes one or more flight segments (ticket_flights). Several flight segments may be included in a single ticket if there are no non-stop flights between the point of departure and destination (connecting flights) or if it is a round-trip ticket. Although there is no restriction in the schema, it is assumed that all tickets in the booking have the same flight segments. Each flight (flights) goes from one airport (airports) to another. Flights with the same flight number have the same origin and destination but differ in the departure date.

At flight check-in, the passenger is issued a boarding pass on which the seat number is indicated. The passenger can only check in for a flight if this flight is included in the ticket. The combination of passenger and seat must be unique to avoid issuing two boarding passes for the same seat.

The number of seats (seats) in the aircraft and their distribution among different classes depend on the model of the aircraft (aircrafts) making the flight. It is assumed that each aircraft model has only one cabin configuration.
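To make this more tangible, here is a minimal SQL sketch of the two central tables. The column types are assumptions derived from the description above and the “Airlines” demo database; the real schema contains more columns and constraints.

-- central entities of the legacy bookings schema (column types are assumptions)
CREATE TABLE bookings (
    book_ref     char(6) PRIMARY KEY,    -- booking reference
    book_date    timestamptz NOT NULL,   -- booking date
    total_amount numeric(10,2) NOT NULL  -- total price of all tickets
);

CREATE TABLE tickets (
    ticket_no      char(13) PRIMARY KEY,                 -- unique ticket number
    book_ref       char(6) NOT NULL REFERENCES bookings, -- the enclosing booking
    passenger_id   varchar(20) NOT NULL,                 -- ID card number, may change over time
    passenger_name text NOT NULL                         -- passenger name, may change over time
);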

The initial situation is thus clear. The task now is to make the booking system easier to integrate with the help of Reedelk. In addition to the aforementioned integration platform, IntelliJ, OpenAPI and Docker are used. To get started, a Reedelk project is created in IntelliJ.

Within the project, a suitable PostgreSQL database, which represents the legacy system, is made available via Docker and a Docker Compose file.

version: "3"
services:
  bookings:
    image: postgres:11.7-alpine
    environment:
      - POSTGRES_HOST_AUTH_METHOD=trust
    ports:
      - 5432:5432
    volumes:
      - ./db/10_init.sql:/docker-entrypoint-initdb.d/10_init.sql
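Assuming Docker and Docker Compose are installed, the legacy database can then be started locally; the init script is executed automatically on the first start.

# start the PostgreSQL container in the background
docker-compose up -d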

API first

Next, the OpenAPI specification for the bookings entity is created. The result is shown in the following code block.

openapi: 3.0.3
info:
  title: Bookings API
  description: API for Bookings
  version: 0.1.0
servers:
  - url: http://localhost
paths:
  /bookings:
    get:
      summary: Returns a list of bookings.
      description: A list of bookings.
      responses:
        '200':
          description: OK
          content:
            application/json:
              schema:
                type: array
                items:
                  $ref: '#/components/schemas/Booking'
              example:
                - book_ref: 00004A
                  book_date: 2016-10-13 18:57:00.000000
                  total_amount: 29000.00
                - book_ref: 00006A
                  book_date: 2016-11-05 02:02:00.000000
                  total_amount: 106100.00
        '404':
          description: Error
    post:
      summary: Creates new booking.
      description: A new booking is created.
      requestBody:
        content:
          application/json:
            schema:
              $ref: '#/components/schemas/Booking'
            example:
              book_ref: 12345A
              book_date: 2020-08-26 07:51:00.000000
              total_amount: 10000.00
      responses:
        '201':
          description: Booking created
        '404':
          description: Error
  /bookings/{booking_ref}:
    get:
      summary: Returns a booking by booking_ref.
      description: A unique booking by booking_ref.
      responses:
        '200':
          description: OK
          content:
            application/json:
              schema:
                type: array
                items:
                  $ref: '#/components/schemas/Booking'
              examples:
                bookingGETExample:
                  $ref: '#/components/examples/bookingGetExample'
        '404':
          description: Error
    put:
      summary: Updates an existing booking.
      description: An existing booking will be updated.
      requestBody:
        content:
          application/json:
            schema:
              $ref: '#/components/schemas/Booking'
            examples:
              bookingPUTExample:
                $ref: '#/components/examples/bookingPutExample'
      responses:
        '200':
          description: OK
        '404':
          description: Error
components:
  schemas:
    Booking:
      properties:
        book_ref:
          type: string
        book_date:
          type: string
          format: 'date-time'
        total_amount:
          type: number
          format: double
  examples:
    bookingGetExample:
      value:
        book_ref: 00006A
        book_date: 2016-11-05 02:02:00.000000
        total_amount: 106100.00
    bookingPutExample:
      value:
        book_ref: 00006A
        book_date: 2016-11-05 02:02:00.000000
        total_amount: 10600.00

The API import tool integrated in the Flow Designer is currently only able to process the sample data as inline samples.

Via Tools > Import Open API, the following four flows can now be generated from the spec:

  • GETBookings
  • GETBookingsBooking_ref
  • POSTBookings
  • PUTBookingsBooking_ref

Thanks to the API-first approach, the import not only generated the flows, but also stored the corresponding examples from the specification within them. The created integration service can therefore be made available to its consumers for development and testing right away, for example as a Docker container. This also enables asynchronous and parallel development of clients and service.

Let the integration flow

At this point, the task is to create the connection to the database for the GETBookings flow. This can be done with the Flow Designer or in a textual way. A flow is always based on an ‘event’, which can be a consumer, listener or scheduler. This ‘event’ then triggers a chain of further actions.

First of all, access to the database is required. For this purpose, the SQL Select component is dragged into the flow. In the next step, the added component has to be configured. To do so, a global configuration for the database connection is created in the Flow Designer interface. Afterwards, only the appropriate SQL statement must be entered.
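The statement itself is plain SQL. A minimal sketch for this flow could look as follows; the LIMIT clause is not part of the original setup, but is advisable for first local tests given the size of the table.

-- fetch bookings from the legacy schema (LIMIT only for local testing)
SELECT book_ref, book_date, total_amount
FROM bookings
LIMIT 100;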

As the data should be transferred to the client in JSON format, the Object-to-JSON component is required and must be placed behind the SQL component in the flow by means of drag-and-drop.

The first flow for querying all bookings through an API is now complete. This simple flow does not contain any business logic; it merely transforms the data into JSON format.
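Based on the example data from the specification, a response of this flow then has the following shape:

[
  {
    "book_ref": "00004A",
    "book_date": "2016-10-13 18:57:00.000000",
    "total_amount": 29000.00
  },
  {
    "book_ref": "00006A",
    "book_date": "2016-11-05 02:02:00.000000",
    "total_amount": 106100.00
  }
]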

By starting the Reedelk Runtime, the created flow can be tested locally. To test the API endpoint, Insomnia Designer will be used, as it allows importing the specification via a URL. The Reedelk Runtime provides such an endpoint for the created service under the URL http://localhost:8484/v1/openapi.yaml.
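Alternatively, a quick smoke test is possible from the command line; the /v1 base path for the bookings endpoint is an assumption derived from the specification URL above.

# fetch the OpenAPI spec served by the Reedelk Runtime
curl http://localhost:8484/v1/openapi.yaml

# call the generated endpoint for all bookings
curl http://localhost:8484/v1/bookings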

When the GET endpoint is queried, one thing quickly becomes apparent: the database is relatively big. The table bookings contains over 2 million entries. The specification already contains the endpoint bookings/{booking_ref}, which can be used to query the booking for a specific booking reference. We can replace the existing mock with the same two components (SQL Select and Object-to-JSON), so that this flow also returns the desired JSON response. The select statement in the SQL Select component has to be configured accordingly, as sketched below.
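A minimal sketch of such a statement follows. How the booking_ref path parameter is bound to the query depends on the configuration of the SQL Select component; the :booking_ref placeholder is therefore only illustrative.

-- look up a single booking by its reference;
-- :booking_ref stands for the value taken from the request path
SELECT book_ref, book_date, total_amount
FROM bookings
WHERE book_ref = :booking_ref;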

Two integration flows have already been created with the Reedelk Runtime and Platform, respectively, based on an OpenAPI specification. The challenge of the large amount of data behind the endpoint /bookings will be considered separately in an upcoming blog post.

Creating a service for the cloud

Finally, the created integration service should be provided as a container. Since the project is based on Maven, this is not too much effort: with mvn clean package, a package is created. A Dockerfile is already generated when the project is created.

FROM reedelk/reedelk-runtime-ce:1.0.5
COPY target/*.jar /opt/reedelk-runtime/modules
COPY reedelk-runtime/lib/*.jar /opt/reedelk-runtime/lib
CMD runtime-start

In the next step the image can be built and then started.

$ docker build -t booking-integration-service:0.1.0 .
$ docker run -p 9988:9988 -p 8484:8484 booking-integration-service:0.1.0
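Once the container is running, the service can be checked in the same way as the local runtime; the path is again an assumption based on the /v1 base path, and 00006A is the example booking reference from the specification.

# query a single booking from the containerized service
curl http://localhost:8484/v1/bookings/00006A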

Thus, the use case is solved for two API endpoints. The integration service is available, but access is currently completely uncontrolled. Proper control can be achieved by using an API gateway. An upcoming blog post will show what this could look like for the described use case. The sources for the demo project are available on GitHub.
