Concurrent Processing with Mule ESB 3.9
In this chapter I will show you how to use Mule to perform tasks in parallel. Commonly, when we have a payload collection, we can use a simple For Each scope and perform some task with each item of the collection. Processing will be sequential, and that is fine. But what happens if the task takes a long time to finish and we have a collection with thousands of items?
Let’s imagine this requirement:
Receive an HTTP POST with a collection of employees and insert them all at once into a simple MySQL database, with these assumptions:
- Insertion order does not matter
- We will use more RAM compared to simple sequential insertions
Possible Solutions
- #1 Mule Scatter Gather : https://docs.mulesoft.com/mule-runtime/3.9/scatter-gather
- #2 Mule Collection Splitter + Aggregator : https://docs.mulesoft.com/mule-runtime/4.2/migration-core-splitter-aggregator
- #3 Mule Async Flow Processing Strategy with max threads : https://docs.mulesoft.com/mule-runtime/3.5/flow-processing-strategies
- #4 A simple JMS strategy
- #5 Mule Database bulk insert
I will choose alternative #2. Let's start!!!
Database creation
- docker run -d -p 3306:3306 --name mymysqldb -e MYSQL_ROOT_PASSWORD=mypreciouspassword mysql:5.7 --character-set-server=utf8 --collation-server=utf8_unicode_ci
- docker exec -it mymysqldb bash
- mysql -uroot -pmypreciouspassword
- create database parallel;
- use parallel;
- create the employee table:
create table employee(
id INT NOT NULL AUTO_INCREMENT,
name VARCHAR(100) NOT NULL,
lastname VARCHAR(40) NOT NULL,
job VARCHAR(40) NOT NULL,
creation_time DATETIME DEFAULT CURRENT_TIMESTAMP,
PRIMARY KEY ( id )
);
Mule Flow
- Create a Mule project with the 3.9 runtime and Maven nature
- Add the HTTP endpoint and configure its connector for a URL like http://localhost:8081/post
- Add a JSON to Object transformer. The result class must be: java.util.ArrayList
- Add a Collection Splitter with default configurations
- Add mysql maven dependency : https://mvnrepository.com/artifact/mysql/mysql-connector-java/8.0.18
- Add a database connector with basic mysql configurations
- In the database connector configuration UI, choose operation: insert (not bulk) and query type: Parameterized
- In the query text area, add this line:
INSERT INTO `employee`(name,lastname,job) VALUES (#[payload.name], #[payload.lastname], #[payload.job])
- Add a Collection Aggregator to wait until all the rows are inserted
- Finally, add a Groovy transformer with some mock response and an Object to JSON transformer
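The MySQL driver dependency from the mvnrepository link above goes into the project's pom.xml (version 8.0.18, as in the link):

```xml
<!-- MySQL JDBC driver required by the Mule database connector -->
<dependency>
  <groupId>mysql</groupId>
  <artifactId>mysql-connector-java</artifactId>
  <version>8.0.18</version>
</dependency>
```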
The Mule flow could look like this:
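As a reference, here is a minimal sketch of the flow XML built from the steps above. The config-ref names (HTTP_Listener_Configuration, MySQL_Configuration) and the Groovy mock response are assumptions from my project; adjust them to yours:

```xml
<!-- Sketch only: config-ref names and the mock response body are placeholders -->
<flow name="parallel-procesing-demo-flow">
  <http:listener config-ref="HTTP_Listener_Configuration" path="/post" doc:name="HTTP"/>
  <!-- Turn the incoming JSON array into a java.util.ArrayList -->
  <json:json-to-object-transformer returnClass="java.util.ArrayList" doc:name="JSON to Object"/>
  <!-- Split the list: each employee is processed as its own message -->
  <collection-splitter doc:name="Collection Splitter"/>
  <db:insert config-ref="MySQL_Configuration" doc:name="Database">
    <db:parameterized-query><![CDATA[INSERT INTO `employee`(name,lastname,job) VALUES (#[payload.name], #[payload.lastname], #[payload.job])]]></db:parameterized-query>
  </db:insert>
  <!-- Wait until every split message has been inserted -->
  <collection-aggregator doc:name="Collection Aggregator"/>
  <scripting:transformer doc:name="Groovy mock response">
    <scripting:script engine="Groovy"><![CDATA[return [status: "ok"]]]></scripting:script>
  </scripting:transformer>
  <json:object-to-json-transformer doc:name="Object to JSON"/>
</flow>
```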
Run
Just perform a simple HTTP POST request using Postman, SoapUI, or a simple curl command:
curl -d '[{"name":"john","lastname":"wick","job":"murder"},{"name":"john","lastname":"mcclane","job":"police"},{"name":"john","lastname":"rambo","job":"swat"},{"name":"john","lastname":"connor","job":"leader"}]' -H "Content-Type: application/json" -X POST http://localhost:8081/post
The Mule log will look like this:
You will see the employees in the database with very similar insert dates:
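To check this yourself, run a quick query inside the mysql client; because the inserts ran concurrently, the creation_time values should be almost identical:

```sql
-- List the inserted employees with their insert timestamps
SELECT id, name, lastname, job, creation_time
FROM employee
ORDER BY creation_time;
```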
Recommended Reading and Links
- Github source code : https://github.com/jrichardsz/mule-esb-snippets-templates/tree/master/parallel-procesing-demo
- (Home Image)https://sierranewsonline.com/wp-content/uploads/2019/05/Photo-3_NPS-Photo_Yosemite-Mounted-Patrol_2019-Bishop-Mule-Days-Featured-Image-620x330.jpg
- https://help.mulesoft.com/s/article/Concurrently-processing-Collection-and-getting-the-results
- https://pragmaticintegrator.wordpress.com/2014/01/31/message-throttling-with-mule-esb/
- https://dzone.com/articles/parallel-processing-in-mule-1
That’s all!!