VANTIQ Resources

Product Tour – Video 1: Application Builder

The VANTIQ videos in this series are designed to show developers and operations managers the simplicity of building real-time, event-driven applications with VANTIQ. A specific use case involving real-time field service of IoT-connected machines is discussed, but please keep in mind that any event-driven application can be easily built in VANTIQ, supporting a wide range of use cases.

This video provides a brief description of the VANTIQ platform and a step-by-step look at building a simple application. The application takes multiple streams of data into the platform and correlates them to identify a typical situation of interest: a machine likely to need maintenance.

Video Transcript:

Effective digital business transformation requires increasingly powerful applications to work closely with humans to address complex situations requiring experience and intuition.

VANTIQ provides the only high-productivity platform for event-driven applications focused on the rapid development and deployment of systems enabling real-time collaboration between humans and machines.

Those involved in creating the next generation of digital business applications will benefit from dramatically reduced time-to-market, significantly lower development and maintenance costs, and maximum agility in response to dynamic market requirements.

In this multi-part video series we are going to build a working, event-driven system that handles multiple streams of data and supports human-machine collaboration through both mobile and web experiences. We’ll also integrate natural language processing and machine learning.

We’ll start with a generic IoT example, taking in a stream of data from multiple temperature and RPM sensors that we want to monitor. Either one of these values by itself might not mean much, but together they can indicate a malfunction or a problem that is likely to occur. We’ll assume in this example that we are monitoring pumps. It’s also important to note that the temperature and revolutions-per-minute readings come from two different streams at two different times, so we’ll need to correlate these values.

First, we’ll take a look at the live data stream. I’ll start by creating a new project in VANTIQ and I’ll call it the PumpProject.

Our data is coming into the VANTIQ platform via a REST call, for example the curl call you see on screen, so let’s visualize these streams of data to get a sense of what kind of information is arriving.
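The curl call itself isn’t shown in this transcript, but the shape of such a request can be sketched as follows. This is a minimal Python sketch, not the video’s actual call: the host, token, topic payload fields, and the `/api/v1/resources/topics` path are all assumptions to be checked against your VANTIQ namespace documentation.

```python
import json

def build_publish_request(base_url, token, topic, payload):
    """Build (url, headers, body) for publishing one event to a topic.

    NOTE: the '/api/v1/resources/topics' path is an assumption based on
    common REST resource APIs, not confirmed by the video.
    """
    url = f"{base_url}/api/v1/resources/topics{topic}"
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    return url, headers, json.dumps(payload)

# Hypothetical temperature reading destined for the /stream/temp topic.
url, headers, body = build_publish_request(
    "https://dev.vantiq.com", "MY_TOKEN", "/stream/temp",
    {"tempSensorId": "ts1", "temp": 181.5},
)
```

Handing the resulting `url`, `headers`, and `body` to any HTTP client (or back to curl) would publish the event.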

I’ll add a subscription for both the temperature and RPM streams. As you can see, once I add them they show up in the resource graph. The resource graph is a place to see all the parts of the project you are working on.

By clicking on the subscriptions, I can listen to the topics they are connected to and see a live view of the data as it streams into the platform. You can see we have 5 different sensors each for temperature and RPMs, and, as mentioned previously, their readings don’t arrive at quite the same time.
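To make the shape of that live data concrete, here is a hypothetical simulation of the two streams (the sensor IDs, value ranges, and field names are invented for illustration, not taken from the video). The key point it mirrors is that temperature and RPM events arrive independently, at irregular times:

```python
import random

def sensor_events(kind, sensor_ids, n=10, seed=0):
    """Yield hypothetical readings like those in the live view.

    Each stream advances its own clock, so temp and rpm events are
    not time-aligned -- which is why they must be correlated later.
    """
    rng = random.Random(seed)
    t = 0.0
    for _ in range(n):
        t += rng.uniform(0.5, 3.0)  # irregular arrival gaps
        value = rng.uniform(150, 210) if kind == "temp" else rng.uniform(3000, 4500)
        yield {"sensorId": rng.choice(sensor_ids), "kind": kind,
               "value": round(value, 1), "time": round(t, 2)}

temps = list(sensor_events("temp", [f"ts{i}" for i in range(1, 6)]))
rpms  = list(sensor_events("rpms", [f"rs{i}" for i in range(1, 6)], seed=1))
```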

Now that we can see our data, let’s build the application that correlates it for us.

I’ll come back over to the Add menu on the right, click Add -> App, and press New App.

In the app’s name box I’ll put “PumpApp”.

You can see the app by default has an empty event stream. By clicking on the yellow box, I can configure this first event stream and call it SenseTemp.

And then I’ll configure the stream.

Since we know data is coming into the system on a topic, I’ll choose topics as the inbound resource. We also know from the resource graph above that the topic name is /stream/temp, so we’ll set that as well and press OK.

We can also visualize the events as they pass through our application.

Let’s save the app and place it in debug mode to see the events as they pass through our application.

The badge decoration on the event stream allows us to see the number of temperature sensor events as they pass through our application.

Let’s also set one up for the rpms stream.

I’ll right-click on the existing stream, choose Add Event Stream, and then we’ll configure the RPM stream as we did before.

I’ll call this one, SenseRPMs.

Set the inbound resource to topics and the inbound resource ID to the topic name /stream/rpms.

We’ll save our work so we can see the data stream through our application.

You can even see here that our data for temperature and RPMs doesn’t arrive at the same rate. And how do we know which temp sensor goes with which RPM sensor? Furthermore, how do we know where these devices are even located? We need to enrich the data with supplemental information that can help us.

I’ve created a simple lookup table in the product (though it could just as easily be stored in an external system), and it clearly shows us which sensors are associated with which pumps, along with the pumps’ locations. We’ll enrich one of our streams with this data so that we can correlate them together.

We’ll right-click on SenseTemp, choose the Enrich activity pattern, and call it enrichData.

Now we’ll configure this activity. We’ll associate it with the lookup table we have called lookups.

We’ll also need to identify a foreign key to determine which property in our stream we’ll use to look up the specific pump. In this case, it’s a property called tempSensorId, since our stream contains this property and so does our lookups table. If the stream and internal type names didn’t match, we could always perform a transformation.
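The enrichment step amounts to a keyed lookup. As a minimal Python sketch (the table rows and their field values here are invented examples, though the field names follow the ones used in this walkthrough):

```python
# Hypothetical rows of the "lookups" type: each ties a temp sensor to
# its rpm sensor, pump, and location.
LOOKUPS = {
    "ts1": {"tempSensorId": "ts1", "rpmSensorId": "rs1",
            "pumpId": "pump1", "location": "Building A"},
    "ts2": {"tempSensorId": "ts2", "rpmSensorId": "rs2",
            "pumpId": "pump2", "location": "Building B"},
}

def enrich(event):
    """Join a SenseTemp event with its lookup row via the tempSensorId
    foreign key, roughly as the Enrich activity pattern does."""
    row = LOOKUPS.get(event["tempSensorId"])
    if row is None:
        return None  # no pump on file for this sensor
    return {**event, "lookups": row}

enriched = enrich({"tempSensorId": "ts1", "temp": 184.2})
```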

Let’s save the application so we can visualize when we are enriching the stream. As you can see by the badge decoration, we are. This produces an object that contains both the SenseTemp stream AND the enriched data from our lookups type. We can see it’s now easy to correlate the temperature sensor with the RPM sensor and to know which pump they belong to, as well as the location of these items.

Now that we’ve got all the data we need, we’ll join the temp stream with the RPM stream, but remember these don’t arrive at the same time. It’s important for us to specify a time window in which the correlation of these sensors is meaningful. After all, if they came in an hour apart, it’s unlikely both values would still be true.

To do this we’ll add a Join activity, which I’ll call joinStreams. We’ll configure the constraint by joining the right side’s SenseRPMs.rpmSensorId with the left side’s enrichData.lookups.rpmSensorId, and we’ll say that this correlation needs to occur within 11 seconds or the data is invalid.
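Conceptually, this windowed join pairs events on a shared key when their timestamps fall within the window. A minimal Python sketch of that behavior (the events and timestamps are invented; this simplifies to matching each temp event against its first in-window rpm partner, which is not necessarily the platform’s exact matching policy):

```python
def join_streams(enriched_temps, rpm_events, window=11.0):
    """Pair each enriched temp event with an rpm event for the same
    rpmSensorId arriving within `window` seconds; events with no
    partner in the window are dropped as invalid."""
    joined = []
    for te in enriched_temps:
        key = te["lookups"]["rpmSensorId"]
        for re_ in rpm_events:
            if re_["rpmSensorId"] == key and abs(te["time"] - re_["time"]) <= window:
                joined.append({"enrichData": te, "SenseRPMs": re_})
                break
    return joined

temps = [{"tempSensorId": "ts1", "temp": 184.2, "time": 10.0,
          "lookups": {"rpmSensorId": "rs1", "pumpId": "pump1",
                      "location": "Building A"}}]
rpms = [{"rpmSensorId": "rs1", "rpms": 4100, "time": 14.0},
        {"rpmSensorId": "rs1", "rpms": 4300, "time": 40.0}]  # too late to match
pairs = join_streams(temps, rpms)
```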

Now we’ll simply link the SenseRPMs stream with the Join activity by linking to an existing task, and we’ll save our work.

Our data is now correlated, and we want to persist the most current correlation inside VANTIQ. To do this we need to put it into a format that makes it ready for inserting or updating in the database. We already have a type called pumps, which has 5 properties: pumpId, location, rpms, temp, and time.

We’ll add a transformation to the data called transformData and configure it. We’ll leave the transformation type as visual. Since we know we need these 5 properties, we’ll click the Add button 5 times.

We want the object we create to match the database fields so we’ll add: pumpId, location, rpms, temp, and time to the outbound property.

We need to specify an expression to determine what values will go in our new properties.

For time, we’re not interested in the time of the temp sensor or the time of the RPM sensor, but rather the time these events were correlated (though if we wanted, we could store all the values). So for time we’ll use a built-in function for the current time called now.

For temp we’ll use event.enrichData.temp; for rpms, event.SenseRPMs.rpms. The location comes from the lookups data, so that’s event.enrichData.lookups.location, and the pumpId is also from our lookups data: event.enrichData.lookups.pumpId.
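Taken together, the mapping above can be sketched as a small transform function. This is a hypothetical Python stand-in for the visual transformation, using a standard-library timestamp in place of the platform’s now() function:

```python
from datetime import datetime, timezone

def transform(event):
    """Flatten a joined event into the five properties of the pumps
    type. `time` is the moment of correlation, not either sensor's."""
    return {
        "pumpId":   event["enrichData"]["lookups"]["pumpId"],
        "location": event["enrichData"]["lookups"]["location"],
        "rpms":     event["SenseRPMs"]["rpms"],
        "temp":     event["enrichData"]["temp"],
        "time":     datetime.now(timezone.utc).isoformat(),
    }

row = transform({
    "enrichData": {"temp": 184.2,
                   "lookups": {"pumpId": "pump1", "location": "Building A"}},
    "SenseRPMs": {"rpms": 4100},
})
```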

On many of the other screens you’ve seen a number of other optional parameters, and we’re going to use them now.

We want this data to be updated into our pumps type, so we’ll set the outbound resource to types and the outbound resource ID to pumps.

Since we are only storing the last value, we want to update the record as opposed to creating new entities. We’ll check upsert, which will insert a new record if one doesn’t exist and update an existing one if it does.
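The upsert behavior is simple to sketch: keyed on pumpId, a second write for the same pump replaces the first rather than adding a row (a dict is used here as a hypothetical stand-in for the pumps type):

```python
PUMPS = {}  # stand-in for the pumps type, keyed by pumpId

def upsert(row):
    """Insert the row if no record exists for its pumpId; otherwise
    overwrite the existing record -- what the upsert checkbox selects."""
    PUMPS[row["pumpId"]] = row

upsert({"pumpId": "pump1", "temp": 184.2, "rpms": 4100})
upsert({"pumpId": "pump1", "temp": 190.0, "rpms": 4200})  # updates, no duplicate
```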

We can see the data is now being persisted into the platform. Now that we have our data, we can use it to find a situation of interest: something that represents an opportunity or a threat to our business.

There are a number of ways we can do this. In this example we’ll use an activity pattern called Dwell. Dwell is used to determine how long something has been in a certain state.

We’ll link a new task to our transformData activity and choose Dwell as the activity pattern. We’ll be looking for a condition where the temperature and RPMs have been over certain values for some period of time. I’ll call this activity overUsage and configure it from our transformed object above, setting the condition to event.rpms > 4250 AND event.temp > 199. We’ll say that the condition must be true for 20 seconds or more, group this condition by the pumpId, and save our work.
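The dwell logic can be sketched in Python as follows. This is a simplified illustration, not the platform’s implementation: the events are invented, each carries a numeric time field, and this sketch fires once per dwell period and resets the clock whenever the condition breaks.

```python
def dwell(events, condition, hold=20.0, key="pumpId"):
    """Report each group whose condition has held continuously for at
    least `hold` seconds -- a sketch of the Dwell activity pattern."""
    since, fired = {}, []
    for e in sorted(events, key=lambda e: e["time"]):
        k = e[key]
        if condition(e):
            since.setdefault(k, e["time"])      # start of the current run
            if e["time"] - since[k] >= hold:
                fired.append((k, e["time"]))
                del since[k]                    # fire once per dwell period
        else:
            since.pop(k, None)                  # condition broken; restart
    return fired

events = [{"pumpId": "pump1", "rpms": 4300, "temp": 205, "time": t}
          for t in (0, 10, 21)]
alerts = dwell(events, lambda e: e["rpms"] > 4250 and e["temp"] > 199)
```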

We won’t see an event for this yet, since the incoming data is not exceeding these values. When we have exceeded our dwell time, we want to tell VANTIQ that this is a situation of interest. We’ll do that by linking a Detect Situation activity to our Dwell activity, and we’ll call it Danger.

We only want the situation to occur once, even if the Dwell, or any other activity type, is triggered multiple times for the duration the situation is open. To do this we’ll configure the situation. We’ll set the entityType associated with this situation to our pumps type and constrain it by specifying that the pumpId from our type must match the pumpId from the event stream.
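The once-per-open-situation behavior is essentially deduplication keyed on the constrained entity. A minimal Python sketch under that reading (the event shape and return value are invented for illustration):

```python
OPEN_SITUATIONS = set()  # pumpIds with a Danger situation currently open

def detect(event):
    """Open a Danger situation for the event's pump unless one is
    already open, so repeated Dwell firings don't raise duplicates."""
    pump = event["pumpId"]
    if pump in OPEN_SITUATIONS:
        return None
    OPEN_SITUATIONS.add(pump)
    return {"situation": "Danger", "pumpId": pump}

first  = detect({"pumpId": "pump1"})
second = detect({"pumpId": "pump1"})  # suppressed while the first is open
```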

Our application can now identify a dangerous situation on pumps from the sensor data coming into the platform.

 

Proceed to the next video in this series.
