How 3 SaaS Companies Built Real-Time Analytics

As SaaS companies build real-time analytics into their applications, development teams are challenged to deliver features with more demanding speed and scale requirements in a timely and cost-effective manner.

How have some SaaS companies navigated this? In this tech talk, we highlight the key considerations for SaaS companies implementing real-time analytics at scale:

  • Low query and data latency: A CRM company built a real-time customer 360 that joins data from multiple product lines. The company needed a solution that could deliver sub-second queries not just for simple searches but also for complex joins.
  • Ops costs of managing the solution at scale: A logistics company introduced a new product offering, an analytics suite, that allowed users to track shipments in real-time. The company had invested in serverless technologies and wanted an analytics database that was also easy to manage at scale.
  • Flexibility of the system to changes in data or queries: A security company wanted to augment high level metrics with ad-hoc queries on real-time data so that users could more easily assess risk in their organization. Given that the queries were not known ahead of time, the team needed a flexible system that could support constantly changing queries on security data that was only a few seconds old.

Speakers

Justin Liu is a Product Manager at Rockset, where he works closely with SaaS customers to create a seamless developer experience for building applications on Rockset. Prior to Rockset, he worked in engineering at Google on various teams including Google Cloud IAM and Data Protection.
Julie Mills is a Product Marketing Manager at Rockset and has previously worked with teams to adopt SaaS products at Wrike and Meltwater.

Show Notes

Julie Mills:

I'm going to go ahead and get started today with our tech talk on how three SaaS companies implemented real-time analytics into their applications.

Julie Mills:

Just a couple of housekeeping items before we get started. All participants will be muted for the duration of the tech talk. If you do have questions, please feel free to put them in the chat. We'll have a live Q&A at the end, and then we'll get to some of those questions.

Julie Mills:

I'm also recording the session today, and all participants will receive a copy of the recording after the tech talk as well.

Julie Mills:

Thanks for joining us. So I'm here with my colleague Justin this morning. Justin, do you want to say a quick hello and give an introduction?

Justin Liu:

Yeah, thank you. Hey everyone. My name is Justin. I work as a PM here at Rockset. I'm primarily focused on creating a great developer experience for those of you building applications with us. And prior to Rockset I was working as an engineer on Google Cloud.

Julie Mills:

Great. And I'm Julie. I'm on the product marketing side of the house. And prior to Rockset I spent about four and a half years at SaaS companies helping teams adopt and implement sales and marketing solutions. So I'm excited to pull from some of my learnings in SaaS into the tech talk that we have today.

Julie Mills:

So here is an agenda for what we're going to cover. We'll be talking about the rise of real-time analytics, and trends that we're seeing in the space. We'll also go over three SaaS companies that implemented real-time analytics, and then get into Rockset's tech, and how it enables real-time analytics, before opening it up for a question and answer period.

Julie Mills:

So we're seeing a lot of growth in real-time analytics. That ranges from companies that are delivering snappy, interactive experiences within their application, to doing semi-autonomous or autonomous machine learning processes. And to give you a taste of what we're covering today, we'll be talking about a logistics company in the construction space that gave their suppliers and their customers greater visibility into shipments and their ETAs, so they can better plan downstream activities.

Julie Mills:

We'll also be talking about a security company that gave their users ad hoc analytics, or an ability to drill down into specific vulnerabilities, to better understand and assess risk for their organization.

Julie Mills:

And then lastly, we'll be talking about a CRM company that joined data from sales, support and marketing interactions, to provide a central customer view of all of that information.

Julie Mills:

One thing you'll note from each of the examples today is that all of these customers were giving their users real-time data and insight with the goal of taking immediate action. And that is a trend that we're seeing across the SaaS industry.

Julie Mills:

We're seeing huge growth in real-time analytics, and a growing number of SaaS companies are now dedicated to building just analytics and AI. In the security space, COVID has pushed many companies to work from home, and security teams are being tasked with protecting a much larger surface of infrastructure, including email, home offices, as well as their network environments. And they're doing that at the same time that there's a wave of more sophisticated cyber attacks. So more companies are looking toward security analytics solutions to help them navigate that.

Julie Mills:

On the logistics side, a McKinsey survey showed that 85% of respondents really struggled with inefficient digital technologies in their supply chain. And so more companies are looking towards greater insight, and also looking at new areas of risk that are popping up as a result of COVID. So with that we're seeing companies like Flexport come to market, where they're bringing end-to-end visibility into the supply chain.

Julie Mills:

The last area that we'll be talking about is more on the sales and marketing side. So in those types of SaaS companies we're seeing a lot of growth within conversational bots, personalization efforts, as well as more hyper-focused targeting solutions and analytics.

Julie Mills:

Gong, for example, in the revenue space, is helping to increase the productivity of sales teams by automating a lot of the manual processes of updating their CRM solution. So we're seeing with Slack and Gong and other solutions that AI and analytics are really driving faster work and greater productivity on those teams.

Julie Mills:

So what is real-time analytics? We'll be touching on four main characteristics of real-time analytics. The first is low data latency. And this is the time from when data is generated to when it is available for analytics. For example, with a logistics company, they want to do real-time route optimization using the latest GPS, weather and inventory data to optimize routes. If there is a delay in getting that data it may result in sub-optimal route decisions.

Julie Mills:

There's also a move toward low query latency. Application users want speedy, snappy, responsive applications that they are querying and interacting with. One of our B2B customers set their standard for real-time analytics query latency as "the speed of Instagram." If you think about Instagram, you're scrolling on the app, and it's showing you relevant pictures and videos from users on that app, all surfaced by an algorithm.

Julie Mills:

The other area is complex analytics. So that Customer 360 example that I gave is a great example of complex analytics. You need to join and aggregate data across multiple product lines to be able to better understand relationships. And this requires systems that can support large-scale aggregations and joins, as well as search.

Julie Mills:

And then lastly is scale. If you're a SaaS company, you want to have the same snappy, responsive experience for your customers as you're scaling the number of users in your application.

Julie Mills:

So what are some of the challenges facing application builders? The first challenge is that analytics systems were not designed for speed. Many analytics systems were built for batch and slow queries, and so it's challenging to retrofit these systems for the millisecond-latency query requirements of real-time analytics, and to do that in a compute-efficient way.

Julie Mills:

There's also growth in constantly-changing semi-structured data. So as a SaaS company we're seeing many start with an initial machine learning algorithm, or a set of analytics, that they're embedding into their application, and they want to be able to expand those capabilities over time. But iterating is challenging when there's constantly-changing semi-structured data that requires a significant amount of performance engineering to get those latency requirements that you need.

Julie Mills:

And then the last challenge is just the complexity of operating these systems at scale. Many companies that we've worked with have said they've managed large-scale distributed data systems, and they just don't want to do it again. They want to keep their lean engineering teams focused on building their apps, and not on managing infrastructure.

Julie Mills:

So we're seeing developers want systems that are fast, flexible and easy for real-time analytics.

Julie Mills:

I'm going to pass it over to Justin now to talk a little bit more about how a logistics company successfully navigated these challenges. Justin, over to you.

Justin Liu:

Cool. Thank you. Awesome. So as Julie mentioned, this is the first of the three SaaS companies we'll look at. This is going to be one of the world's largest construction logistics companies.

Justin Liu:

And so first, let's take a look at the description of their application. This construction logistics company is looking to build a service that centralizes all the data and transactions relevant to everyone inside the supply chain. We'll get to it a little bit later, but this includes their buyers, their suppliers and their transporters. They want all of the relevant information on a single platform. And they really sell this application on three main things, but the big deal is that they facilitate every single step of the workflow, from the initial order placement all the way until the final bill is paid.

Justin Liu:

And the first thing that they really offer here is centralization. As you can imagine, products and devices generate data points throughout the supply chain. Every material and every part, whether it's being ordered, produced, transported or placed, generates data at every single step of the way, from when the materials are actually being mined, created and processed, all the way to the construction site. And they put all of that on one single platform, which didn't really exist before in the construction world. That's one huge point they sell on.

Justin Liu:

And second is really this end-to-end visibility. This is for managers who are trying to oversee a massive operation with so many different stakeholders involved. With this application, at any point in time, you can take a snapshot and see where every single moving part is inside your supply chain.

Justin Liu:

And finally, the thing that they really sell on is this real-time aspect, and we'll show in a moment why this is really, really important. As you can imagine, you're working on a construction project, and you have a truck of wet concrete that is moving to a construction site. If it doesn't get to the exact location at the exact time, that entire load, for instance, could be ruined, and the entire construction project could be jeopardized. So real-time, on-time, actionable information is super, super important for this application.

Justin Liu:

All right. Next we'll just take a look at some of the potential people involved inside this supply chain. I picked some icons here that may or may not be relevant, but you can take a quick look. There are powder sites, liquid sites and aggregate sites shipping to all these material processing factories, of which there are a lot of kinds. I'm no construction expert, but these are some of the stakeholders involved. Finally, everything has to get looped in to the construction sites, and there are a lot of people who are going to be looking at this platform: obviously the managers, but also all the contractors working at the construction sites, and all the people in the office, like accountants, who want to be able to manage the entire project in one place at any point in time.

Justin Liu:

So next we're going to take a look at some of the actual challenges that they faced, and talk about, back here in tech land, what their architecture looks like. You can see I took a quote from this customer. He told us they're on DynamoDB right now, running in a completely serverless architecture. And their problem is that they're trying to do some complex searches on DynamoDB for this new platform, and they're looking for something that's a little faster and more scalable.

Justin Liu:

And there are two really, really big requirements here that make this application particularly challenging to build. The first is really the flexibility of the system, and there are two aspects to this. First is the data itself. As you know, DynamoDB is a NoSQL database, so the data is semi-structured. It's not nicely schema-ed, as you might expect on a typical relational database.

Justin Liu:

And part of the issue with the type of application they're building is that you can never really predict what new projects their customers will require. Every single project is so different. All their customers are different. The parties inside the supply chain are so varied that not even the logistics service itself can predict what data customers are going to put into the application. So the data is just constantly changing. You can't always predict what is going to happen.

Justin Liu:

And the same goes for queries. The queries they perform are not always pre-programmed. They're often very ad hoc queries that customers are running on the fly. And that's really what makes this so difficult. They don't have the luxury of reshaping and optimizing their data, defining an index in their database, or partitioning their data a certain way to optimize for speed, which is what you might do if you had the time to tune your queries and your database for a particular query pattern or data schema. They really don't have that luxury here. So they need something that's fully flexible, and can account for anything that comes its way.

Justin Liu:

And the second piece here that's really, really challenging for them is that at this massive scale, with all these requirements, they still need it to be real-time. They care about the data latency, which is the time from when the data is actually created until it's queryable, and the query latency, which is how long it takes for a query to come back with a response. Both of them have to be really, really fast. As we were talking about earlier, stale data or slow queries could jeopardize entire construction projects: materials could arrive at the wrong location at the wrong time, or take a sub-optimal route. So it's really important that at any point all of the data is accurate in real-time, and the queries return promptly as well.

Justin Liu:

So after looking at this problem, let's take a look at some of the options they considered. They know that they have to support this real-time analytics use case on their logistics service. And currently, as we mentioned, they're already on DynamoDB. That's their primary OLTP database, and they're not going to come off of that.

Justin Liu:

And so, really, their options were about figuring out what they could do with that. The first thing they tried was actually just running these queries inside of DynamoDB. They were running complex analytical queries, joining their data and running all these aggregations, directly in DynamoDB, and it really did not work for them. In fact, one of the first reasons they knew they were going to have to look outside of DynamoDB is that, while they were onboarding one of their biggest customers, they literally could not vertically scale their DynamoDB enough. When they tried to run these joins, they ran out of memory. No matter how large they made their database, they simply couldn't support it. So they knew they had to look for other options.

Justin Liu:

And that's when they looked at these two other options. They basically said, "Why don't we offload these costly analytical queries to a secondary index?" So they took a look at two options here: they looked at Elasticsearch, and they looked at Rockset. In fact, when they came to us they told us they were in the middle of a POC and really close to signing with Elastic.

Justin Liu:

And you can see how that might work here. So for both Elastic and Rockset we basically replicate the data inside your DynamoDB, and then you can just run the queries on Elastic or on Rockset instead. And with Elastic you just configure your DynamoDB change streams, and through a bit of work you can actually set it up so that Elastic has a copy of all your DynamoDB data, and then you can begin to run queries on that.

Justin Liu:

And similarly with Rockset, you can use our click to connect DynamoDB connector, which will automatically read those change streams, copy the data, and you can just run simple queries on Rockset as well.
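To make the replication idea concrete, here's a toy Python sketch of what a change-stream connector does conceptually: it applies ordered insert/modify/remove events to a replica of the table. The event shapes here are simplified and hypothetical, not the exact DynamoDB Streams record format.

```python
# Conceptual sketch of change-stream replication (event shapes are
# simplified/hypothetical, not the real DynamoDB Streams format).

def apply_change_events(replica, events):
    """Apply ordered change-stream events to a replica keyed by item id."""
    for event in events:
        kind = event["type"]              # "INSERT", "MODIFY", or "REMOVE"
        key = event["key"]
        if kind == "REMOVE":
            replica.pop(key, None)        # delete the replicated item
        else:
            replica[key] = event["item"]  # upsert the latest item image
    return replica

replica = {}
events = [
    {"type": "INSERT", "key": "order-1", "item": {"status": "placed"}},
    {"type": "MODIFY", "key": "order-1", "item": {"status": "shipped"}},
    {"type": "INSERT", "key": "order-2", "item": {"status": "placed"}},
    {"type": "REMOVE", "key": "order-2"},
]
apply_change_events(replica, events)
print(replica)  # {'order-1': {'status': 'shipped'}}
```

Once the replica is continuously kept in sync this way, analytical queries can run against it without touching the primary database.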

Justin Liu:

So ultimately they ended up deciding to go with Rockset. And we're just going to talk about a few of the reasons why here. They let us know that they found Rockset to be particularly good for them.

Justin Liu:

And the first was really that they were really, really used to conventional SQL. This was a key piece for them, they said, because this type of secondary index was completely new to them. They didn't want to have to learn a whole bunch of new frameworks and languages just to query their data. All of their engineers were, of course, very familiar with conventional SQL, and they were very happy to keep using the SQL joins and aggregation commands that they were so used to. With Rockset, we just support normal SQL. And so they were really, really comfortable doing that, which saved them a lot of time.
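As a small illustration of the kind of conventional SQL the team could keep writing, here's a join plus aggregation using SQLite as a stand-in (the schema and table names below are invented for the example; Rockset itself speaks standard SQL over its own collections):

```python
import sqlite3

# SQLite standing in for any SQL engine; shipments/suppliers schema is
# purely illustrative of the logistics use case described above.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE shipments (id INTEGER, supplier_id INTEGER, status TEXT);
CREATE TABLE suppliers (id INTEGER, name TEXT);
INSERT INTO shipments VALUES (1, 10, 'in_transit'), (2, 10, 'delivered'),
                             (3, 20, 'in_transit');
INSERT INTO suppliers VALUES (10, 'Acme Concrete'), (20, 'Gravel Co');
""")

# A conventional join + aggregation: in-transit shipments per supplier.
rows = db.execute("""
    SELECT s.name, COUNT(*) AS in_transit
    FROM shipments sh JOIN suppliers s ON sh.supplier_id = s.id
    WHERE sh.status = 'in_transit'
    GROUP BY s.name
    ORDER BY s.name
""").fetchall()
print(rows)  # [('Acme Concrete', 1), ('Gravel Co', 1)]
```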

Justin Liu:

And secondly, they were really, really happy about the queries at scale that we were able to run for them. What's really unique about using Rockset is that you can horizontally scale your compute, separately from your storage. If you increase the number of CPUs allocated to your compute, the speed of your queries will basically go up, depending on how many CPUs you assign to execute them. That flexibility meant they could pick the price-performance sweet spot they really wanted: how much they wanted to pay, and how much compute they wanted, for the type of query performance they were looking for.

Justin Liu:

And finally there was this time-to-market piece, which is really a result of the first two points as well. We have this awesome quote that they gave us that I'd love to share. They basically said that they were able to decrease their engineering roadmap from six months to one weekend. Part of the reason they were able to do this is that our click-to-connect setup was so easy. They didn't have to manually configure anything. Once they connected Rockset to DynamoDB, which is literally a five-minute configuration, it automatically replicated the data into Rockset, and they didn't have to do anything else. They could just write SQL on it. They didn't have to learn anything new; they just clicked a few things and started writing SQL against their DynamoDB data. And for them this was super great. They were able to onboard that big customer they couldn't support before, and they really got their time to market down.

Justin Liu:

So next I'll hand it over back to Julie, who will talk about our security analytics use case.

Julie Mills:

Awesome. Thanks, Justin.

Julie Mills:

So there's a cloud security service that offered continuous monitoring and alerting for SaaS applications. For example, one of the applications they did monitoring and alerting for was Zoom. Users of their application wanted to ask questions such as, when and where did this user log into Zoom? So that's an example of a search question that they would ask.

Julie Mills:

They also wanted to build notifications as well. So if a security team member was not currently in the application, they could still be notified of vulnerabilities. So for example, alerting a security team on a number of failed login attempts.

Julie Mills:

And then lastly, they wanted to give their users an ability to create their own custom dashboards, so they could figure out what was most important to them to analyze and mitigate security risks with applications like Zoom.

Julie Mills:

So what did their data stack look like for real-time analytics? They had application data, coming, for example, from Zoom, Slack, Microsoft, et cetera. They also had device data on their users. They streamed that data through Kafka, and also stored it in a data lake and warehouse, where they built some of their machine learning algorithms. Rockset has a built-in connector to Kafka, so it can sync data from that source, automatically index that data, and then serve customer-facing real-time analytics, as well as alerts and notifications.

Julie Mills:

So on the customer-facing real-time analytics side, a user could access the security application, and they could be able to filter down for specific information. So based on the application or the user, they could see their log-ins, any vulnerabilities that they have, and be able to filter through that information. And then on the alert side, if there were different security thresholds that were set, those would trigger those alerts to that security team.

Julie Mills:

So what were some of their challenges implementing this SaaS solution? One of their challenges was around ad hoc analytics. As a security analytics company, your bread and butter is really the quality and comprehensiveness of your analytics. And users not only wanted machine learning, they also wanted the ability to slice and dice the data to better dissect where the risk was within their organization, and, if they had seen some sort of vulnerability in one application, whether it was also happening in other applications as well. And that requires the ability to support ad hoc querying, without knowing the query patterns ahead of time.

Julie Mills:

They also wanted to join data. So if a user was using Slack and Zoom, they wanted to join that data together. The customer was currently using Elasticsearch to power the search and analytics features of their application, but they were struggling to implement the complex logic, because Elasticsearch does not natively support joins. It was taking hundreds of lines of application code to do what they could have done with 10 lines of SQL. And that was making it take a lot longer to bring new analytics capabilities to market.
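To illustrate the gap, here's a sketch of a join done manually in application code, the way you'd have to on top of a store without native joins, next to the single SQL statement that replaces it. The field names are hypothetical.

```python
# Manual application-side hash join (what you'd write on top of a store
# without native joins). Field names are invented for illustration.

def app_side_join(logins, users):
    """Join login events to user records on user id, in application code."""
    by_id = {u["id"]: u for u in users}          # build the hash side
    joined = []
    for login in logins:                          # probe side
        user = by_id.get(login["user_id"])
        if user is not None:
            joined.append({"email": user["email"], "app": login["app"]})
    return joined

logins = [{"user_id": 1, "app": "Zoom"}, {"user_id": 2, "app": "Slack"}]
users = [{"id": 1, "email": "a@example.com"},
         {"id": 2, "email": "b@example.com"}]
print(app_side_join(logins, users))
# [{'email': 'a@example.com', 'app': 'Zoom'},
#  {'email': 'b@example.com', 'app': 'Slack'}]

# With native SQL joins, the same logic collapses to one statement:
#   SELECT u.email, l.app FROM logins l JOIN users u ON l.user_id = u.id;
```

And this toy version hides the hard parts of doing it for real at scale: paginating both result sets, handling memory limits, and re-implementing this for every new query shape.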

Julie Mills:

And then they also had operational overhead. They had adopted Elasticsearch, but they were realizing, as their number of customers scaled and grew, that they were going to need to double down on their devops team to manage that infrastructure. And they wanted to redirect a lot of their engineering effort towards their core application, and away from infrastructure management.

Julie Mills:

So why did this company ultimately look at Rockset? One reason was the sub-second query latency. They wanted to make it really easy for a security team member in their application to ask multiple questions, get the responses in milliseconds, and really cut down the time it would take to make a decision on how to proceed with a security vulnerability. As they tested and used Rockset, they were able to see that we could support counts (the number of logins, the number of failed login attempts, et cetera), joins across these applications, searches, and also building aggregate metrics on top of sub-metrics to assess risk. And all of those queries returned with sub-second query latency.
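As a sketch of those count-style security queries, again with SQLite standing in for Rockset and an invented schema:

```python
import sqlite3

# SQLite stand-in with an illustrative login-events schema; the real
# system would run the same kind of SQL against streamed security data.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE login_events (user_id TEXT, app TEXT, success INTEGER);
INSERT INTO login_events VALUES
  ('alice', 'Zoom', 1), ('alice', 'Zoom', 0), ('alice', 'Zoom', 0),
  ('bob', 'Slack', 1);
""")

# Count failed login attempts per user -- the kind of metric that could
# feed an alert threshold.
failed = db.execute("""
    SELECT user_id, COUNT(*) AS failed_attempts
    FROM login_events
    WHERE success = 0
    GROUP BY user_id
""").fetchall()
print(failed)  # [('alice', 2)]
```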

Julie Mills:

They were also able to give more ad hoc querying capabilities within their application, and really this aspect of self-service analytical queries. With Elasticsearch it was taking a lot of time to massage the data and configure indexes to get the performance needed to support ad hoc query capabilities. Whereas Rockset's Converged Index approach indexes all of your data three different ways, to support a number of query patterns out of the gate.

Julie Mills:

And then the last thing is accelerated time to market. From the time they started their trial of Rockset to when they implemented their analytical features was a month. And they were really keen on how easily they could expand and iterate on their analytics over time, because of the ease of use.

Julie Mills:

So now I'm going to move on to a CRM application that used Rockset for real-time analytics. This customer relationship management company wanted to launch a new product offering that would aggregate data from their sales, marketing and support products to create a single customer view. So for example, if a user was to log in, they could look up a customer profile and see everything associated with that profile, such as website interactions, support tickets and sales meetings. And then they could also enrich that data with third-party datasets of contact or company information, like location or employee size.

Julie Mills:

They then wanted to be able to segment those customers based on that profile. So for example, you could create a top-tier segment of accounts for sales, based on things like employee size, geography, industry and sales input. And then based on those tiers and segmentations they could automate workflows, applying different low-touch or high-touch actions to different segments across those product lines and channels.
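Rule-based segmentation like this can be sketched in a few lines; the thresholds and field names below are invented purely for illustration.

```python
# Hypothetical rule-based account segmentation; the tier rules and
# attributes are invented for illustration, not the customer's logic.

def segment(account):
    """Assign an account to a tier from simple profile attributes."""
    if account["employees"] >= 1000 and account["region"] == "NA":
        return "top-tier"
    if account["employees"] >= 100:
        return "mid-tier"
    return "low-touch"

accounts = [
    {"name": "BigCo", "employees": 5000, "region": "NA"},
    {"name": "MidCo", "employees": 300, "region": "EU"},
    {"name": "TinyCo", "employees": 12, "region": "NA"},
]
tiers = {a["name"]: segment(a) for a in accounts}
print(tiers)
# {'BigCo': 'top-tier', 'MidCo': 'mid-tier', 'TinyCo': 'low-touch'}
```

In practice the same tiering would be expressed as a SQL query over the joined customer profile, with downstream workflows triggered per tier.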

Julie Mills:

So what does their data stack look like for real-time analytics? So their different products were siloed. So they had the support product, sales product and marketing product, and they streamed that data through Kafka into downstream applications and databases like Rockset. And then Rockset was able to join that data, index it and serve it for their Customer 360, and their segmentation as well.

Julie Mills:

So what were some of the challenges in implementing real-time analytics? The data was siloed across those multiple product lines, and joining that data across those products was slow. The company had initially implemented a batch-oriented system with support for joins, but they were starting to see join performance stretch into multiple seconds, which was beyond the SLA requirements of this new application.

Julie Mills:

They were also scaling their customer base. They had started this new product offering, and as the number of customers on the product grew, their query performance was faltering. They were looking at tens of seconds for searches and aggregations as they scaled.

Julie Mills:

And then lastly, they were also looking for multi-tenancy, and offering different performance SLAs, as well as different analytics capabilities for different tiers of their own customer base. And so they wanted a solution that could support multi-tenancy.

Julie Mills:

So while this company had initially architected a solution for real-time analytics, they were realizing that, even with a lot of performance engineering and tuning of that data management system, it still wasn't able to deliver on the performance required of the application.

Julie Mills:

So what were some of the benefits of Rockset? One was that sub-second query latency: the ability, even with these really wide aggregations and joins on data, to still meet the sub-second query latency requirement for the application.

Julie Mills:

The other aspect was low data latency. As I mentioned before, their initial solution was batch-oriented, and they wanted real-time inserts, updates and deletes. So for example, if I was to go into the platform and look up my customer, I would want to see that profile updated in real-time. If they'd updated their email address, or if they'd just been added to a new campaign, I would want to see that information. Otherwise, I might be confused about whether or not they were in the campaign I had just added them to.

Julie Mills:

And then lastly was ease of use. I mentioned that this company actually did have a very robust data team that could manage a lot of infrastructure, and did successfully manage a lot of infrastructure in-house. So they were willing to put resources into making technologies work. But one of the things they recognized as they moved to Rockset was that they were able to sync data directly from their sources with click-to-connect data connectors. And they also really liked that they could upgrade or downgrade compute resources based on the volume of customers that they had, as well as the performance requirements for different customers.

Julie Mills:

So I'm going to go ahead and turn it back to Justin to talk a little bit around Rockset's tech.

Justin Liu:

Awesome. Thanks. So next we're going to go a little bit behind the scenes of Rockset's technology, to discuss and show you a little bit of what we do under the hood that makes all these cool benefits that Rockset offers possible.

Justin Liu:

So the first thing we're going to look at is something we call our Converged Index, which you may have heard us talk about before. When your data gets ingested into Rockset, every single field gets indexed automatically in at least three different ways. This includes an inverted index, which is useful for point lookups, a columnar index, which is useful for aggregations, and a row index, which is useful for data retrieval. And your data can be indexed further as well. We can sometimes create range indexes, which are useful for range scans with low selectivity, and you can also optionally create special geo indexes, and many more.
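A toy sketch of those three index shapes (this is just the idea, not Rockset's implementation): one pass over the ingested documents populates an inverted index for point lookups, a columnar index for aggregations, and a row index for retrieval.

```python
# Toy model of the three index shapes; Rockset's real indexes live in a
# persistent key-value store, not Python dicts.

docs = {1: {"city": "SF", "qty": 5}, 2: {"city": "NY", "qty": 7},
        3: {"city": "SF", "qty": 2}}

inverted = {}     # (field, value) -> set of doc ids, for point lookups
columnar = {}     # field -> list of values, for aggregations
row = dict(docs)  # doc id -> full document, for data retrieval

# A single ingest pass populates every index.
for doc_id, doc in docs.items():
    for field, value in doc.items():
        inverted.setdefault((field, value), set()).add(doc_id)
        columnar.setdefault(field, []).append(value)

print(inverted[("city", "SF")])  # {1, 3}  -> point lookup
print(sum(columnar["qty"]))      # 14      -> aggregation
print(row[2])                    # {'city': 'NY', 'qty': 7} -> retrieval
```

The point of paying the write-time cost of all three is that the query optimizer can then pick whichever shape answers a given query fastest.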

Justin Liu:

And really the advantage of creating all of these indexes is two-fold. One, obviously, it's just really, really fast. Our optimizer will pick whichever indexes make for the fastest way to execute your query, which is part of what allows query latencies on Rockset to be so low, returning responses quickly no matter how complex the queries might be in structure.

Justin Liu:

And secondly, what this is really nice for is that it makes things super, super easy on the user side: you don't have to worry about shaping your data or defining your indexes, because Rockset does all of that for you. We automatically index everything, and our optimizer automatically chooses the right indexes to use. On the user side, all you have to do, again, is click to connect, and after that you can just begin writing SQL, without any of the tuning that you might have to do with some other services in the same space.
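The Converged Index idea described above can be sketched conceptually in Python. This is a toy illustration, not Rockset's actual implementation: the same documents feed an inverted index for point look-ups, a columnar index for aggregations, and a row index for retrieval, and each query shape uses whichever structure suits it.

```python
from collections import defaultdict

docs = [
    {"_id": 1, "city": "SF", "amount": 30},
    {"_id": 2, "city": "NY", "amount": 50},
    {"_id": 3, "city": "SF", "amount": 20},
]

# Inverted index: (field, value) -> set of matching document ids.
inverted = defaultdict(set)
# Columnar index: field -> list of that field's values across all docs.
columnar = defaultdict(list)
# Row index: id -> full document, for whole-row retrieval.
row = {}

for doc in docs:
    row[doc["_id"]] = doc
    for field, value in doc.items():
        if field != "_id":
            inverted[(field, value)].add(doc["_id"])
            columnar[field].append(value)

# Point look-up (like WHERE city = 'SF') hits the inverted index.
sf_ids = inverted[("city", "SF")]
# Aggregation (like SELECT SUM(amount)) scans only one column.
total = sum(columnar["amount"])
# Retrieval (like SELECT * WHERE _id = 2) uses the row index.
full_doc = row[2]
```

In a real system the optimizer would choose among these structures per query; here the choice is written out by hand to show why each index exists.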

Justin Liu:

And second, one of the really cool things about Rockset is that it combines the worlds of NoSQL and SQL. You'll notice that a lot of the data we ingest can be totally schemaless. We talked about this earlier: we can ingest data from, for instance, DynamoDB and MongoDB, which are NoSQL databases, or even data lakes like S3 and GCS. When that data comes into Rockset it may not have a schema ahead of time, but Rockset still lets you run SQL queries over it as if your data were relational.

Justin Liu:

So how does it actually do that? Rockset has something called smart schemas, which essentially infer the schema from the data as it comes in. It independently determines the type of every single value and strongly types it in our backend. So once the data is ingested into Rockset, every single field has an inferred datatype. That's what makes these queries able to run so quickly: the schema has already been inferred and applied at ingestion time, so when you actually run your queries, all of the fields are strongly typed and can be queried directly.
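The smart-schema idea can be illustrated with a minimal sketch. The function name and behavior here are made up for the example and are not Rockset's internal code; the point is just that a schema can be inferred per value as documents arrive, including fields whose type varies across documents.

```python
def infer_schema(docs):
    """Infer the set of types observed for each field across documents."""
    schema = {}
    for doc in docs:
        for field, value in doc.items():
            # Record every distinct type seen for this field; mixed
            # types are allowed, just as with schemaless ingestion.
            schema.setdefault(field, set()).add(type(value).__name__)
    return schema

events = [
    {"user_id": 42, "name": "ada"},
    {"user_id": "43", "name": "bob", "active": True},  # user_id arrives as a string here
]

schema = infer_schema(events)
# user_id was seen as both int and str; name only as str; active only as bool.
```

A real system would also use the inferred types to pick efficient on-disk encodings and to type-check SQL expressions at query time.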

Justin Liu:

And lastly, what makes Rockset so special is really our serverless data architecture. Here you can take a quick look at what our backend looks like. Your data comes into our tailers, which index it into our leaf nodes. Then when you write queries, they get picked up by our aggregators, which combine the results until they get served to your application.

Justin Liu:

And there are a couple of really key advantages to the way Rockset is architected. One is that you can scale your compute and storage separately, which we've touched on a few times in this presentation now. What's really awesome about this is that you pay for the storage you'd like, and separately you pay for the compute you'd like. So you don't have to scale one with the other, which not only lets you cost-optimize your compute and storage spend, but also lets you find that sweet spot for query performance: if you double the number of CPUs running on the read side of this architecture, you can literally get double the read performance. And that's a huge advantage you get with Rockset.

Justin Liu:

And beyond this storage-compute separation, the entire architecture is serverless, so you don't have to worry about any of it. We're not just talking about not having to fine-tune your data, your databases, or your queries. All of this is fully managed, so you don't have to worry about scaling or about turning servers on and off; Rockset does all of that for you automatically. It's a super-low operational burden, and it's really low-maintenance to run on Rockset. It's just easy to get started and keep going.

Justin Liu:

Awesome. So that's the end of our presentation. Thank you all for coming. We've shared a couple of links here: one where you can request a demo, and one to start a free trial. But yeah, I think we'll stick around for a few more minutes to do some Q&A. I'll hand it back over to Julie.

Julie Mills:

Great. So feel free to go ahead and chat in any questions that you have, and we can answer those for you as they come up.

Julie Mills:

So we have one question. What tools are teams using for visualizations and dashboards in their applications? Justin, could you take that question for me?

Justin Liu:

So that is a really popular use case for people who are using Rockset: a lot of people use different kinds of visualization tools, or even BI analytics tools sometimes, to take a quick look at their data. Some of the most popular ones that we've seen are Redash and Retool. You can really build some nice graphs in there. We have a full list inside our documentation at docs.rockset.com; under Query Your Data there's a whole section on visualization tools. I think the top two are probably Redash and Retool. Tableau is also super popular. So you can directly visualize your data inside any of those three. We also have supported integrations with Grafana, Power BI, which I believe is Microsoft's, and Superset. So we have integrations with all of those visualization tools.

Julie Mills:

Great. Thanks. I also got a question around how companies choose between using Elasticsearch or Rockset for different use cases, so I'll go ahead and take that one as well. Elasticsearch was originally built for text search use cases. So for that example, you're looking at supporting ten different languages and making search across them incredibly efficient. Rockset was originally built for real-time analytics, and we talk a little bit more around this idea of structured search: you might know the fields that you want to query ahead of time, or have specific fields that you want to query, and Rockset is really good at those needle-in-the-haystack search queries. So while both solutions can support real-time analytics use cases, we differ a little bit in our specializations and in how we've optimized our solutions.

Julie Mills:

And let me see, are there any other questions that we're getting?

Julie Mills:

So I have one question around AWS infrastructure being all about events, and whether there are plans to support that, such as subscriptions to events. Awesome, thank you so much for your question, I appreciate it. Let me circle back with you on our product roadmap, and I can talk more around the different areas of support that we'll have in the future for different types of data sources.

Julie Mills:

Justin, do you want to chat a little bit about how you can bring your data into Rockset if it's not in one of the data sources that we support? I think that's a common question that we get.

Justin Liu:

Definitely. So obviously we have a list of seven or eight supported data sources, where we automatically ingest your data and then keep it in sync. If your data is not inside one of those sources, or you just want to manage ingestion on your own so that Rockset isn't doing the thinking for you, you do have the option of using our write API. The write API is essentially a subset of endpoints inside the Rockset API that are used to insert, update, and delete documents in your Rockset collections. So if you want to stream your own data directly into Rockset, you can take a look at the write API documentation, which is also in Rockset's docs.
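A sketch of what streaming documents in through the write API could look like follows. The base URL, endpoint path, and payload shape below are taken from Rockset's public API docs as best remembered, and the workspace and collection names are hypothetical; check the write API documentation before relying on any of them. The request is only constructed here, not sent.

```python
import json

API_KEY = "YOUR_API_KEY"      # placeholder, never hard-code real keys
WORKSPACE = "commons"          # hypothetical workspace name
COLLECTION = "shipments"       # hypothetical collection name

# Documents endpoint for adding docs to a collection (verify the exact
# path and regional base URL against Rockset's API reference).
url = (
    "https://api.rs2.usw2.rockset.com"
    f"/v1/orgs/self/ws/{WORKSPACE}/collections/{COLLECTION}/docs"
)
headers = {
    "Authorization": f"ApiKey {API_KEY}",
    "Content-Type": "application/json",
}
# Each item in "data" becomes one document in the collection.
payload = json.dumps({"data": [{"order_id": 1, "status": "shipped"}]})

# To actually send, something like:
#   requests.post(url, headers=headers, data=payload)
```

In practice you would batch documents into the `data` array rather than posting one at a time, and handle the per-document statuses in the response.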

Justin Liu:

Something to keep in mind here is that Rockset is not a transactional database. It's not intended to be your primary OLTP database for that reason; it's really meant for running your real-time analytics. So you want to be careful not to put transactional data that you're depending on inside Rockset in that manner.

Julie Mills:

Awesome. Thanks, Justin. I'll give a couple of minutes for any last questions. But if anyone wants to jump off, go ahead and do that. Thank you all for joining us today, and remember that you can request a demo with us or start a free trial. And please go ahead and do that. We'd love for you to give us a try for your real-time analytics use cases.

Julie Mills:

I actually think that's it on the questions. Thanks, Justin. I appreciate you joining me today.

Justin Liu:

Awesome. Thanks.

Julie Mills:

Take care. Bye, everyone.
