When integrating data, you need fast and easy ways to extract data from any source system and get it into your target system — a MarkLogic database.
Apache NiFi was built to automate the flow of data between systems, providing a configurable drag-and-drop user interface. MarkLogic provides supported processors built for Apache NiFi, and this integration makes NiFi a great choice for getting data into MarkLogic.
With Apache NiFi, you can use out-of-the-box processors to create data flows from relational databases such as MySQL or Postgres, Apache Kafka data streams and other sources in the Hadoop ecosystem, and many other data sources. If a processor doesn't exist, you can build your own, or you can create templates for common data flow patterns and then publish and share them for others to benefit from and collaborate on.
Watch this quick lightning talk, where we demo how relational data can be loaded into MarkLogic in less than five minutes using Apache NiFi.
This screenshot shows a simple process for getting relational data into MarkLogic. A SQL query is executed to extract data from a relational system; a NiFi processor then converts the resulting Avro output to JSON, and that JSON data is put into MarkLogic.
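The three-step flow described above (execute SQL, convert the result to JSON, put the documents into MarkLogic) can be sketched in plain Python. This is an illustration of the pattern, not the NiFi processors themselves: sqlite3 stands in for the relational source, and the MarkLogic REST endpoint URL and the /customer/ URI scheme are illustrative assumptions.

```python
import json
import sqlite3

def extract_rows(conn, query):
    """ExecuteSQL step: run the query and return rows as dicts."""
    cur = conn.execute(query)
    cols = [c[0] for c in cur.description]
    return [dict(zip(cols, row)) for row in cur.fetchall()]

def rows_to_json_docs(rows):
    """ConvertAvroToJSON step (analogue): serialize each row as a JSON document."""
    return [json.dumps(row) for row in rows]

def marklogic_put_urls(docs, base_url="http://localhost:8000/v1/documents"):
    """PutMarkLogic step (sketch): each document would be PUT to the
    MarkLogic REST API at a URL like the ones built here. Only the URI
    construction is shown so the sketch stays runnable offline; the
    base_url and URI scheme are assumptions for illustration."""
    return [f"{base_url}?uri=/customer/{i}.json" for i, _ in enumerate(docs)]

# Demo with an in-memory table standing in for the relational source.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER, name TEXT)")
conn.execute("INSERT INTO customer VALUES (1, 'Alice'), (2, 'Bob')")

rows = extract_rows(conn, "SELECT id, name FROM customer")
docs = rows_to_json_docs(rows)
urls = marklogic_put_urls(docs)
print(docs[0])  # {"id": 1, "name": "Alice"}
```

In NiFi itself, each of these functions corresponds to a configured processor on the canvas rather than code you write by hand.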
NiFi also includes out-of-the-box processors and readers for many other sources, including:

Azure Event Hub
Exchange Web Services
Google Cloud Bucket
JSON Path Reader
Windows Event Log