Transcript:
SAP ELT Use Cases
Jeff Carr (Precog)
I’d like to welcome everybody to the webinar this morning, or this afternoon for some of you: Getting More Value from Your SAP and Non-SAP Application Data. We’re going to wait just a minute as people continue to join us.
Okay, we’re going to go ahead and get started. I’d like to welcome everybody to the webinar, Getting More Value from Your SAP and Non-SAP Application Data. Today’s speakers are myself, Jeff Carr, CEO of Precog, an AI-powered data integration platform and SAP partner; Sami Haddad, a senior sales engineer here at Precog; as well as Kat Cheng, SAP Senior Director of Product Marketing, who is also presenting and speaking today. The agenda today will be a solution overview, where we will discuss SAP Datasphere as well as Precog. We’ll then go through a number of use cases and how people are leveraging these platforms to get more value from their data. We’ll then go through a brief solution demonstration, followed by a summary and Q&A. At any point during the webinar, you can post questions in the chat.
Cloud-based ELT
So the focus of this solution is accessing web-based or cloud-based applications in order to get data out and into Datasphere for analytics, machine learning, and, increasingly, AI applications. As this slide points out, today, when most companies acquire a new business application, it’s a cloud application. It runs in the cloud as a SaaS application, which means that, in order to access the data in it, they need to go through the API that the vendor who created that application provides. You can see on this slide that the usage of APIs has been growing at an extremely high rate over the last five to ten years. Typical enterprises will have many different applications they need to access through APIs. These include not only SaaS applications, but also public APIs, such as those provided by governments, as well as partner and even IoT applications. So that’s the challenge that companies are dealing with. It’s a highly technical challenge. As we know, API stands for Application Programming Interface; it’s a technical interface to the platform, and being able to access that and easily extract the data for purposes of analytics and machine learning can be a challenging technical problem. With the Precog and SAP Datasphere platform, we’ve simplified that problem tremendously. I’ll now hand it over to Kat to walk us through the SAP Datasphere platform.
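Editor’s note: to make the API challenge concrete, here is a minimal, purely illustrative sketch of what pulling data out of a SaaS application’s REST API typically involves. The endpoint, token, and parameter names are hypothetical, not any specific vendor’s API.

```python
import requests

# Hypothetical SaaS endpoint and API token; every vendor's API differs in
# authentication, pagination style, and rate limits.
BASE_URL = "https://api.example-saas.com/v1/invoices"
API_TOKEN = "..."  # supplied by the application vendor

def fetch_all_records():
    """Pull every record from a paginated REST endpoint."""
    records, page = [], 1
    while True:
        resp = requests.get(
            BASE_URL,
            headers={"Authorization": f"Bearer {API_TOKEN}"},
            params={"page": page, "per_page": 100},
            timeout=30,
        )
        resp.raise_for_status()
        batch = resp.json().get("data", [])
        if not batch:
            break          # no more pages
        records.extend(batch)
        page += 1
    return records
```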
SAP goes over how they use Precog
Katryn Cheng (SAP)
Great. Thank you so much, Jeff. Great to see everyone here on the webinar. My name is Kat Cheng, as Jeff just introduced. I’m in SAP product marketing for Datasphere and Analytics Cloud. I’ve been with SAP for over 20 years in various roles, all focused on data and analytics. For those of you who aren’t familiar with SAP Datasphere: Datasphere is a comprehensive data service built on our SAP BTP platform. It offers data integration, as you can see here, data federation, analytic modeling, business semantics, data cataloging, self-service data access, and more. So it is a full data management platform. It also serves as the foundation for a business data fabric. What is a data fabric? It is an architecture that enables seamless access to your business data without losing the semantics and business context. I’ll give you an example. Typically, to prepare ERP data for analytics and AI, most organizations would have to extract that data out of the ERP system first and then rebuild all that lost semantics and context. This takes a lot of time, effort, and resources. Datasphere eliminates this step by enabling access to this critical business data without losing all the important semantics and context, so all the entities, measures, calculations, hierarchies, currencies, and context are fully preserved, giving you more agility in your business and speed to insight. Users can then continue working with their business data within Datasphere, creating analytic models and, of course, blending that business data with your non-SAP data as well, data from other applications. Now, on that point, we understand that SAP customers have way more in their landscapes than just SAP. There is an enormous number of new applications being delivered today, appearing and being added to customers’ landscapes almost every day. And like Jeff said, these cloud applications have APIs that allow you to get data out of their systems. Otherwise, this would become a really big challenge for customers: how would you get to that important data sitting in your new applications? So this is where our partner, Precog, comes in. Jeff, next slide, please. Precog is natively built into Datasphere and helps expand our connectivity to over 2,000 cloud applications. This really helps our customers do more with their SAP data. Like Jeff said, this webinar is all about getting more value out of your SAP and non-SAP data. Combining all this critical information and data scattered throughout your landscape is a big game changer for most customers. So now I’ll hand it back to Jeff, who can speak more on the Precog integration with Datasphere.
Jeff talks about Precog and SAP
Jeff Carr (Precog)
Thanks, Kat. Appreciate it. So as you can see in this slide, and as Kat pointed out, we’re tightly integrated with SAP Datasphere, and what we focus on is connecting to the many SAP applications outside of the core S/4HANA implementations, things like SAP Ariba, Concur, SuccessFactors, Fieldglass, and many, many others. But if you’ll notice on the right-hand side of the screen, we are also able to integrate with non-SAP applications, things like Workday and ServiceNow and Microsoft Dynamics and NetSuite and many, many other things. Our platform runs in the cloud. It’s a SaaS platform. It’s a 100% no-code platform; it doesn’t require any coding or manual mapping of data of any kind. It does automatic schema detection, and it automatically detects all custom fields, which are extremely common for people who have implemented SAP and non-SAP applications. It loads all of this data as analytics-ready, AI-ready tables into SAP Datasphere, so that you can then do the kinds of things that Kat was just mentioning around analytics, data governance, and data cataloging, and then obviously use that data in SAP Analytics Cloud as well. The platform is easily accessible and supports, as we pointed out, over 2,000 different applications, almost any application a business might need to use. And equally important, if there’s a new application that your business acquires and you need to access that data and load it into Datasphere, it can be added to your existing platform using our advanced AI in a matter of days. So now we’ll go ahead and walk through some customer success stories.
Obviously, I think this is one of the most important ways people can really get an understanding of how they can leverage this technology, by hearing how other companies have been successful with it. So the first one we’ll talk about is a major US utility. This is a large utility company that needed a 360-degree view of their human capital management. As you would expect, utility companies are very people-intensive, whether it’s people out working on the lines or in the plants, or business people. The applications they needed to access in order to get these insights were SAP S/4HANA, SuccessFactors, ECP (Employee Central Payroll), Fieldglass, which is their field workforce management tool, PowerPlan, which is an FP&A platform, Workforce Software, Qualtrics, SAP Cloud for Customer, and SAP Emarsys. The solution we implemented was Precog, SAP Datasphere, and SAP Analytics Cloud. We were able to deliver this solution in approximately 60 days, a little bit less, and the outcome they were able to achieve with that is a consistent view of their employees and workforce from an HR perspective. They’re able to capture and enhance all of their HR-related analytics and modernize all of their data and analytics infrastructure. They achieved a simplified data architecture, better resource allocation, and improved employee and customer satisfaction. The next use case we’ll talk about is a global, European-based nutrition supplier that needed better visibility into their supplier management projects. The applications they needed to access were S/4HANA, SAP Ariba, which is SAP’s procurement platform, and a platform called Intelex, which is an employee health and safety platform. The solution: Precog, SAP Datasphere, and SAP Analytics Cloud. What was the outcome? They were able to improve compliance for employee health and safety, implement supplier cost reductions and better compliance with supplier regulations, and gain the ability to measure project effectiveness and manufacturing plant visibility. Some additional outcomes: supplier engagement improved, internal processes were optimized, and again, compliance improved with various policies around employee health and safety. Another use case we successfully implemented is with a global mining company. The platforms they needed to access were S/4HANA, SAP Ariba, the procurement platform, SuccessFactors, which is the SAP HCM platform, and then a platform called UKG Kronos, which is basically a timekeeping application that allows you to manage hourly employees. The solution: Precog, SAP Datasphere, and SAP Analytics Cloud.
The outcome: enhanced supplier onboarding, contract lifecycle performance and risk management, and employee time card tracking details. So they were able to improve a number of different metrics around how they’re managing their workforce, their hourly employees, and things of that nature: increased business user empowerment, end-to-end supplier visibility, and employee productivity. So those are some examples of how we’ve implemented this technology and improved outcomes and analytics for these different companies, and, as you can see, in a variety of different industries. I just wanted to take a moment and highlight some additional areas where we’ve been able to successfully access many different applications, many of which are not well supported by existing data integration vendors. Areas like manufacturing, with some of the systems you see there from Siemens, OSIsoft, Kepware, and GE. In energy and mining, we have a lot of joint customers with SAP, and in addition to supporting all of the SAP applications, we support additional applications, things like Enverus OpenInvoice, EnergyLink, and EIA, which is a government source. We support a wide variety of healthcare applications and a wide variety of utilities applications. I mentioned it before: government APIs can be very rich with data. We have a number of joint customers using things like Federal Reserve data, which, if you’re trying to do various financial and FP&A planning, can be very interesting for benchmarking. Payment processing, FP&A, health, HCM (human capital management), security, and CRM are just a few examples. This is by no means an exhaustive list, as we support over 2,000 different applications, but we’re giving you a flavor for some of the areas where we’ve been able to easily access these applications, load the data into Datasphere as analytics-ready tables, and then keep that data up to date on whatever schedule the user chooses. I will now hand it over to Sami Haddad, who will do a quick demonstration of Precog and Datasphere.
Sami talks about how Precog works
Sami Haddad (Precog)
Thank you, Jeff, and thanks, all, for joining us this morning or this afternoon, wherever you may be. Like Jeff said, I’m going to guide us quickly through the platform. We will flow data from Ariba into Datasphere, and I’ll show you what that data looks like in Datasphere. So let’s jump in here. Like Jeff mentioned, Precog is a web-only platform, which you can access at precog.cloud. If you prefer to use our GDPR-compliant version, you can go to eu.precog.cloud as well. Signing in is quite straightforward. We support SSO and SAML implementations as well for our large enterprise customers. For the time being, we’ll simply use a username and a password. Once you are logged in, Precog is going to prompt you to connect to one of our over 2,000 available sources. Today, Jeff brought up some important sources that we’ve used to achieve success with many of our customers, so I’ll highlight those now. A great example would first be Kronos, the timekeeping app, with Kronos Workforce Ready. Just to give you an idea of what it takes to connect, it’s usually no more than just a few pieces of credentials, and we’ll always provide instructions as to how to get those credentials. The other apps that Jeff was referring to include Intelex. I’m happy to show you what that looks like here.
And last, but certainly not least, the whole point of this demonstration is that we can show you how to get more value from your SAP products as well. So let me show you what those SAP products look like. Looking at SAP Ariba from the top, one might think that SAP Ariba is a single platform, which it is, but it features many different API products, all of which we support today. More or less any web application with available documentation we can certainly support, and in this case, we’ve been able to support SAP Ariba across many of their largest customers by flowing Ariba data into Datasphere. What are other examples? Like Jeff said, we do support SAP SuccessFactors and SuccessFactors Payroll, and also Field Service Management reporting and SAP Fieldglass. But don’t let what you see here limit your imagination. Like we mentioned before, we offer over 2,000 connectors today, and any web application with API documentation we can certainly connect to. Going back to the main purpose of the demonstration, I’m going to connect to SAP Ariba analytical reporting; let me show you what that process looks like. I do have an existing source, but this is what would be required to connect to a new source. Again, we always provide those explicit instructions as to how to procure these credentials. Now that I’m connecting to an existing source, Precog has already done all the work for me.
Precog has received the JSON payloads and turned those into analytics-ready data sets, and now I simply need to choose the data sets I want to flow into SAP Datasphere. Let me choose one for the sake of the demonstration. Like Jeff said, we can pull custom tables and the custom fields that roll up to those tables, no additional configuration required. This is all standard in the platform. Now that I’ve chosen my desired data set, it’s time to send it to a destination. Let me show you how we can connect to Datasphere. Connecting to Datasphere is very straightforward. Not only do we provide the direct instructions, but all we usually need is a host name, a username, and a password. Very straightforward here. Similarly to my source, I’m going to connect to an existing destination as well. And in terms of the configuration aspect of the pipeline we’ve just created together, that’s it. What I mean by that is that we were able to procure credentials and plug them into Ariba, procure credentials and plug them into Datasphere, and then choose our desired data set. So now we’re just going to name the pipeline, and we’ll begin to flow that data from Ariba into Datasphere.
As you can see here, we have begun staging the commodity dim data. Let’s take a moment to walk through the flexible nature of the pipeline, to show you how you can control the data that flows into Datasphere, into your destination. From the top: with Precog, you can always edit the destination. Should there be a password issue, we’ll always inform you of that, and you can change it directly in the product. You may always want to add data sets; that makes perfect sense. This is a walk-to-run approach. You can always add those data sets by simply clicking here and then choosing your source of choice, so the pipeline is built to grow with your team’s data demands. Last but certainly not least, you can add a schedule. Precog is able to automate these loads, whether that’s once a day overnight or more frequently, down to multiple times a day. The only thing that prevents us from loading any faster would be the API itself, so keep in mind the use case when connecting. Finally, as Jeff already mentioned and I’ll reiterate here, this is what Precog is doing in the background: it is automatically detecting the primary key.
It is automatically detecting the offset, to ensure that after the historical load has completed, every subsequent load is just fresh data, ensuring that you’re only pulling the freshest data when that data is required. Now that we’ve gone through the platform, from connecting to a new source to loading that source’s data into Datasphere, I’m just going to take one moment to show you what that data looks like in Datasphere. As you can see here, we were able to successfully import the commodity dim data and generate a view on top of it. This is exactly what we would expect from the SAP Ariba data. For customers looking to do analytics, you can use SAP Analytics Cloud to connect to your Datasphere, and from there, you can start using these dashboards to ensure that your company is hitting all the KPIs required as part of your project. So Precog is very much resolving that first mile of data movement. And that’s really how, with Precog, in a few minutes and a few clicks, you can start flowing data from any web application into Datasphere. Jeff, that’s my demonstration here. I’m happy to pass it back to you. Thanks for the time. Thanks.
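Editor’s note: a minimal sketch of the incremental-load idea Sami describes: after the historical load, only records past a stored watermark (a primary key or modification offset) are pulled on each run. The endpoint, field names, and state file below are hypothetical, not Precog’s internal implementation.

```python
import json
import os
import requests

STATE_FILE = "commodity_dim_state.json"  # where the last offset is remembered
API_URL = "https://api.example-source.com/v1/commodity_dim"  # hypothetical endpoint

def load_offset():
    """Read the last successfully loaded offset (e.g., a modification timestamp)."""
    if os.path.exists(STATE_FILE):
        with open(STATE_FILE) as f:
            return json.load(f)["last_modified"]
    return None  # first run: full historical load

def incremental_pull(token):
    """Fetch only records changed since the stored offset, then advance it."""
    offset = load_offset()
    params = {"modified_since": offset} if offset else {}
    resp = requests.get(API_URL, params=params,
                        headers={"Authorization": f"Bearer {token}"}, timeout=60)
    resp.raise_for_status()
    rows = resp.json()["data"]
    if rows:
        new_offset = max(r["last_modified"] for r in rows)
        with open(STATE_FILE, "w") as f:
            json.dump({"last_modified": new_offset}, f)
    return rows
```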
Q&A
Jeff Carr (Precog)
All right, so how can you get started? There are a number of different ways. The Precog solution is available on the SAP Store, so those of you who are familiar with the SAP Store can go there and buy it at any time. It is also available through a newer program in Europe called Buy Now. You can also connect with our team to schedule a use case review. The contacts, in addition to myself, are everyone listed there: Mike Corvisero, Sami Haddad, who just did the demonstration, and Gary Reindeller, who’s our SAP program manager. The solution, as we just went through in the last 30 minutes, allows you to connect to any web-based application, whether it’s an SAP or non-SAP application, ingest the data from the application API into your Datasphere platform as completely analytics-ready tables, and begin using tools like SAP Analytics Cloud on top of it immediately for purposes of analytics, machine learning, and AI. It’s a no-code solution, as we mentioned and as you just saw in Sami’s demonstration, and in many cases, with applications like Ariba and SuccessFactors and Fieldglass, we can have that data flowing into your Datasphere data platform in a matter of minutes or even just a few hours. So that is the webinar today. We will open it up for Q&A and see what questions have been posted while we’ve been speaking.
What SAP sources does Precog support?
Jeff Carr (Precog)
All right, so this is Precog. This is the platform we were just in, and as we go through, these are all the available SAP integrations today. You can see there are a number for SAP Ariba, because it provides multiple APIs. But there’s also SAP Business ByDesign, SAP Business One, Business One Reports as a Service, Cloud for Customer, Commerce Cloud, Concur, Contract Workspace, Digital Manufacturing, Employee Central Payroll, warehouse management, Field Service Management, Fieldglass, integrated reporting for supply chain, HANA for asset management, finance, human resources, manufacturing, sales, service, procurement, and supply chain, Signavio, SuccessFactors, and SuccessFactors Payroll. So those are the available applications today, which is pretty much the entire suite of the SAP family of applications. And then, of course, we can add new ones as they are created or become part of the SAP platform quite easily. Other questions?
My customer is running in Azure, in UAE, and the government requires their data to be local — can Precog deploy to any Hyperscaler?
Jeff Carr (Precog)
Yes, absolutely. We can run in any hyperscaler, in any region, if that’s a requirement. We have customers in the Middle East today and have done deployments in some of the unique geographies there that are, you know, government regulated.
Does Precog have any professional certificates?
Jeff Carr (Precog)
I’m not sure I completely understand the question. We obviously work closely with SAP as a partner, and our solutions are certified within their environment.
Can Precog support tables with millions of rows and many changes daily?
Jeff Carr (Precog)
So keep in mind that today, our joint customers with SAP are some of their largest customers in the world. Think global automotive manufacturers, global defense companies, global pharmaceutical companies, in some cases loading millions and millions of rows a day. The Precog architecture is cloud-based. It’s all Kubernetes-based. It auto-scales both vertically and horizontally and can load, in some cases, terabytes an hour. So we’ve not run into any issues with scaling the platform to meet the needs of the largest customers.
How long has Precog been working with SAP customers?
Jeff Carr (Precog)
Precog has been working with SAP for about four years, so we’ve actually been working with SAP for quite a while. Just some of the history: our first integration was Ariba, which is why we have so many different unique flavors of it. Ariba is one of the more challenging APIs to connect to; anybody on this call or webinar who’s done it probably knows that, and we were able to achieve a high level of integration relatively quickly. I think that’s where the relationship began, and since then, we’ve just continued to increase our partnership with them. We’re one of their preferred suppliers in this area right now.
How does Precog work with Partners for deployments?
Jeff Carr (Precog)
We work with integration partners; we’ve done that. In the utility example that we gave in the customer use cases, we worked closely with a global systems integrator to deliver the solution. They were the feet on the ground for the customer and helped with things like use case specifications and technical details, and obviously we provided the platform and guidance around the API integrations. But we work closely with regional, local, and global system integrators and consultancies.
I have a customer using another data warehouse, and would like to move SAP data into there. Is that possible with Precog?
Jeff Carr (Precog)
We wouldn’t connect to the other data warehouse. Precog focuses on connecting to APIs; that is, our AI is specifically built to connect with web-based APIs. We do not connect directly to databases, for example, so we wouldn’t connect to another database as a source, whether that’s a data warehouse or just an operational database. That’s not part of what our platform is designed to do.
Precog
I think it was meant to ask if Precog can help move data from an SAP source into other data warehouses?
Jeff Carr (Precog)
Oh, sure, yeah, of course, we do support other data warehouses. Obviously, with SAP, we partner closely around Datasphere; it’s an amazing platform. But we do have customers that are non-SAP customers as well, who will load data into other popular data warehouses, and we support pretty much all of them.
What’s Precog’s standard time-to-insight?
Jeff Carr (Precog)
In many cases, we literally can start generating insights within 24 hours. If the use case is one of the applications shown here, whether that’s Ariba or Business One or Cloud for Customer, or SAP Commerce Cloud or Concur, we can connect to that application and start flowing data, typically the same day. And with the power of Datasphere and SAP Analytics Cloud, customers can start generating insights, typically within 24 hours.
Where can Precog be accessed?
Jeff Carr (Precog)
Thanks, Chris. And just while we’re waiting, I’ll reiterate that the platform can be accessed at precog.cloud or eu.precog.cloud. If you have access to any of the systems, both SAP and non-SAP, that we’ve talked about, you can test it. There is a trial available. Precog is only in English currently.
How long until Precog starts working after purchase?
Jeff Carr (Precog)
Again, typically, the platform can be up and running and loading data into Datasphere within the same day, assuming you have credentials for the application you’re connecting to. Be it SAP or non-SAP, you can start flowing data into Datasphere, typically the same day.
What security protocols are in place to ensure sensitive data pulled into Datasphere is protected?
Jeff Carr (Precog)
Look, from a Precog perspective, Precog is SOC 2 Type 2 certified. We are also GDPR, HIPAA, and CCPA certified. So we maintain all of the primary security certifications for cloud platforms. We have a security office, and we complete security reviews for all of our largest clients. All of our security information is available through our security office. Once the data has flowed through Precog into Datasphere, then obviously that’s SAP’s security infrastructure, which they can speak to.
How can you support expansion into the Kingdom of Saudi Arabia?
Jeff Carr (Precog)
So, similar to the question about the UAE earlier, we can run in any of the approved hyperscalers in the Kingdom of Saudi Arabia. Typically, we can create that instance within a few days.
Can you flow your data to Collibra?
Jeff Carr (Precog)
Yeah, so you wouldn’t necessarily flow the data into Collibra; you would put it into Datasphere. Collibra is a partner of SAP and of SAP BTP and Datasphere as well. From there, you would point your Collibra solution at Datasphere and be able to perform the data governance and cataloging functions and things like that on the data in Datasphere. So that’s actually a really good question. It shows the power of the partnerships that SAP has created. If you think about that data pipeline, step one is getting the data into Datasphere. That’s what we’re here for, and that’s what we do. Once it’s in there, you can use additional tools like Collibra for things like data governance and data cataloging directly on the Datasphere platform.
How does Precog support applications in artificial intelligence?
Jeff Carr (Precog)
The way to think of AI is that we’re all familiar with the notion of LLMs, and there are about 22 or 23 public LLMs that people use, certainly ChatGPT and things of that nature. In a corporate environment, you effectively have to create what amounts to a corporate LLM in order to implement any sort of long-term AI strategy. What that consists of is gathering all of your corporate data into a single place, in this case SAP Datasphere. In order to create that LLM, that data needs to be vectorized, which is basically just a nice way of saying it needs to be relational, in clean rows and columns, so that you can create the vectors that allow the LLMs to function correctly. And that’s what Precog does. You can’t create AI applications directly from the object data that comes out of almost all of these APIs, whether that’s JSON or XML; that structure is incompatible with the AI applications and any sort of LLM structure you would want to create. So what we do is create that vectorized structure that goes into Datasphere, and from there, building those applications on top of it becomes much more straightforward and doable.
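Editor’s note: as a rough illustration of the structural point (not Precog’s actual algorithm), here is a minimal sketch of flattening nested JSON objects from an API into clean, tabular rows that downstream embedding or analytics tools can consume. The record shapes and field names are made up.

```python
def flatten(record, parent_key="", sep="_"):
    """Recursively flatten a nested JSON object into a single flat row."""
    row = {}
    for key, value in record.items():
        col = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            row.update(flatten(value, col, sep))   # descend into nested objects
        else:
            row[col] = value                       # scalar values become columns
    return row

# Example: an API response shaped as nested objects...
api_records = [
    {"id": 1, "supplier": {"name": "Acme", "country": "DE"}, "amount": 1200.5},
    {"id": 2, "supplier": {"name": "Globex", "country": "US"}, "amount": 980.0},
]

# ...becomes rows with consistent columns, ready to land in relational tables.
rows = [flatten(r) for r in api_records]
# [{'id': 1, 'supplier_name': 'Acme', 'supplier_country': 'DE', 'amount': 1200.5}, ...]
```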
Does the customer need to contract with Precog or SAP?
Jeff Carr (Precog)
So you do buy it directly from Precog. You can purchase it through the SAP Store, but it is still directly purchased from Precog. You cannot buy it directly from SAP today. They do have the Buy Now feature, which is a new program in Europe that allows you to also buy it through SAP, but it still flows back to Precog.
Are you able to work with unstructured data and structured data together?
Jeff Carr (Precog)
Yes. Precog doesn’t make any assumptions about the structure of the data; that’s actually part of the AI that we’ve created. We can essentially connect to and work with data of unknown structure, be that structured or unstructured, simultaneously, and take that data and create the downstream tables and data models that you need for analytics, machine learning, and AI.
Can we pull data from CDS views in SAP Cloud?
Jeff Carr (Precog)
Sami, do you know that? I want to say yes, but I know that came up on a previous webinar or seminar.
Sami Haddad
Yeah, that’s correct, Jeff. We can pull from CDS views.
What is Precog’s pricing model?
Jeff Carr (Precog)
So the pricing is in the SAP Store, but basically it’s per application that we’re connecting to. It’s an annual subscription. In some cases, a lot of our SAP customers will do multi-year agreements just to simplify it, but it’s just an annual subscription. I think the baseline cost to connect to a single application API is about $15,000 per year, so a little over $1,000 a month. And then, as you scale that up, the cost per application goes down, but you can see the pricing in the SAP Store.
My customer has lots of custom fields and tables they have built over the years; can Precog automatically gather that data without programming?
Jeff Carr (Precog)
Yes, absolutely, that is a key feature of the Precog platform and one of the reasons we’re doing so much work with SAP. Customization in SAP applications, any of them, is extremely common and important and part of the value of the SAP platforms. So we’ve built our technology in such a way that we can automatically detect custom fields and custom tables as we’re interrogating the data sets returned from the API. Our approach is very different from existing vendors, where you’re essentially building an integration by hand, manually coding to some stock schema that might be in the documentation. We’re actually doing the opposite: we’re making the web request, getting the response, which is an object of data, and then interrogating that object and asking it, effectively, what data do you have? So the easy way to think of it in Precog is that we really don’t differentiate between custom and non-custom. It’s all just data, and we’re going to capture all of the fields that come back in that API response automatically. That is a big part of the value, and in many cases the reason we’re able to start flowing data so quickly: we have no notion of needing customizations in order to capture all of a customer’s application data.
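Editor’s note: a minimal sketch of the “interrogate the response” idea in general terms (not Precog’s actual implementation): discover every field, standard or custom, by walking the records the API actually returns rather than coding against a fixed schema. The example records are hypothetical.

```python
def discover_fields(records):
    """Take the union of all field names seen across API records,
    so custom fields appear automatically alongside standard ones."""
    fields = set()
    for record in records:
        fields.update(record.keys())
    return sorted(fields)

# Two records from a hypothetical API: the second carries a custom field.
records = [
    {"OrderID": "PO-1001", "Amount": 5400.0},
    {"OrderID": "PO-1002", "Amount": 230.0, "cust_CostCenter": "CC-42"},
]

print(discover_fields(records))
# ['Amount', 'OrderID', 'cust_CostCenter']  <- custom field captured without mapping
```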
How does Precog handle APIs that change constantly?
Jeff Carr (Precog)
Sure. So, as I mentioned a few times in the webinar today, Precog is powered by AI. I know people hear that a lot today, but what we really mean is automation. Many of the changes, things like authentication changes or even schema changes, we detect automatically, and we’ll either adjust automatically in the platform or send a message to the user. If it’s something they need to do on their end, they’ll get an automated message through the platform instructing them what change they might need to make in order to continue to flow data. So while, as the questioner here points out, SAP APIs are quite complex and can change often, we have automated a great deal of the challenges there. Not 100%, and I don’t know if that’s ever going to be possible, but again, part of the reason we have such a strong partnership with SAP is that we’ve been able to automate a lot of the changes that are common in APIs and make it easier for their customers to keep the data flowing.
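Editor’s note: a rough sketch of what detecting a schema change could look like in principle (purely illustrative, not how Precog does it internally): compare the fields inferred from the latest API response against the previously known set and flag the drift.

```python
def detect_schema_drift(known_fields, latest_records):
    """Compare previously known columns against what the API returns now."""
    current = set()
    for record in latest_records:
        current.update(record.keys())
    added = current - set(known_fields)
    removed = set(known_fields) - current
    return added, removed

known = ["OrderID", "Amount", "Supplier"]
latest = [{"OrderID": "PO-2001", "Amount": 99.0, "Currency": "EUR"}]

added, removed = detect_schema_drift(known, latest)
if added or removed:
    # In a managed pipeline this would trigger an automatic adjustment
    # or a notification to the user, as described above.
    print(f"Schema drift detected. New: {added}, missing: {removed}")
```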
Does Precog offer worldwide support?
Jeff Carr (Precog)
We do. Precog is a global company. We have people in, I think, 12 different countries right now, and we can offer global support to all of our customers, and we indeed do that today. We have SAP customers on, I think, almost every continent right now.
Does Precog support any transformation in the pipeline, or does that all happen in Datasphere?
Jeff Carr (Precog)
So let me explain how we think about transformation. There are two transformations that have to occur. First, there’s taking the data from the API, which, again, is object data, primarily JSON or occasionally XML, which is an object model, and transforming that into a relational model: a clean relational table with primary keys identified and data types identified. People on this call who are familiar with JSON and XML know that typing there is either nonexistent or poor, so we’re typing the data so that you have analytics-ready types. We’re handling the object-to-relational structure and putting that in Datasphere. If you then want to build second- and third-order data models on top of that, views, if you will, you would do that in Datasphere using the embedded tools for data modeling and creating additional models and value. So it’s really a two-step process. What we’re doing automatically is dealing with the structural problem. If you want to create a downstream model to support a particular use case, you would do that in Datasphere with the embedded Datasphere tools.
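Editor’s note: to illustrate the typing point, here is a simplified sketch, under the assumption of JSON input, of how raw values might be coerced into analytics-ready column types. The type names and rules are illustrative; Precog’s actual type system is not shown here.

```python
from datetime import date

def infer_type(values):
    """Pick an analytics-ready column type from raw JSON values
    (JSON itself only distinguishes strings, numbers, booleans, and null)."""
    non_null = [v for v in values if v is not None]
    if all(isinstance(v, bool) for v in non_null):
        return "BOOLEAN"
    if all(isinstance(v, int) and not isinstance(v, bool) for v in non_null):
        return "INTEGER"
    if all(isinstance(v, (int, float)) and not isinstance(v, bool) for v in non_null):
        return "DOUBLE"
    # Strings that parse as ISO dates become DATE columns.
    try:
        for v in non_null:
            date.fromisoformat(v)
        return "DATE"
    except (TypeError, ValueError):
        return "STRING"

print(infer_type([10, 25, 3]))                   # INTEGER
print(infer_type(["2024-01-31", "2024-02-01"]))  # DATE
print(infer_type(["open", "closed", None]))      # STRING
```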
Does Precog support OData?
Jeff Carr (Precog)
We do. Precog not only supports OData, we enhance it. OData has limitations around things like the ability to do incremental loading or CDC; it’s not very good at that, whereas our implementation of OData supports that 100%. So, again, we support OData completely for all SAP applications, and we actually enhance it and make it much better than it is out of the box.
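Editor’s note: as a rough, hypothetical example of incremental loading over OData (the service URL, entity set, and timestamp field are made up, and this is not Precog’s implementation), one common pattern is filtering on a last-changed field so each run only pulls new or modified records, following server-driven paging via @odata.nextLink.

```python
import requests

# Hypothetical OData v4 service and entity set; real SAP services and
# field names vary by application.
SERVICE_URL = "https://example.sap-host.com/odata/v4/PurchaseOrders"

def pull_changes(session, last_watermark):
    """Fetch only entities modified after the stored watermark."""
    url = SERVICE_URL
    params = {
        "$filter": f"LastChangeDateTime gt {last_watermark}",  # e.g. 2024-06-01T00:00:00Z
        "$orderby": "LastChangeDateTime asc",
    }
    rows = []
    while url:
        resp = session.get(url, params=params, timeout=60)
        resp.raise_for_status()
        payload = resp.json()
        rows.extend(payload.get("value", []))
        url = payload.get("@odata.nextLink")  # next page, if any
        params = None  # nextLink already embeds the query options
    return rows
```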
I have some older systems that run on XML data sets. Can Precog support that?
Jeff Carr (Precog)
We absolutely support XML. We support JSON, XML, GraphQL, all the different forms of API connectivity and data models out there.
Can you support my custom applications that our IT team has built?
Jeff Carr (Precog)
It’s a good question. Now again, the bar or the entry point for Precog is some sort of API that we have enough documentation to be able to understand how it functions. If a company has built a proprietary application or an internal application, and they’ve built an API with that, even if it’s a proprietary or internal API, then we can connect to that and we can load that data into your Datasphere platform. It’s not a problem, but it does need to have some sort of API that we can connect to, you know, make a web request and get some data back.
How do you work with SAP to jointly support customers?
Jeff Carr (Precog)
So we work closely with them. Obviously, we manage the Precog part of the platform; it is a fully managed platform, and in many cases we will detect problems before the customer is even aware of them and fix them. But SAP is obviously still responsible for their applications. And we see this: there will be times when we’ll detect an issue with a particular integration that we’ve completed and realize that it’s actually a problem with the API, potentially a bug or something that needs to be changed. We’ll report that back to the SAP technical team that’s responsible for that product, and in most cases, they’ll be able to make that adjustment quickly.
How much data can you flow through Precog?
Jeff Carr (Precog)
As I mentioned earlier, Precog is Kubernetes-based. It scales vertically and horizontally and is capable of processing terabytes of data an hour for any client. So any scalability issue with doing API integrations is almost always going to come from the rate limits of the API itself, not our infrastructure or the underlying application. But again, we’ve had no problem supporting many of SAP’s largest clients.
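Editor’s note: since API rate limits are the practical ceiling Jeff mentions, here is a generic, illustrative sketch of rate-limit-aware fetching with backoff. The endpoint is hypothetical, and the header handling follows the common HTTP 429 / Retry-After convention, which individual APIs may or may not implement.

```python
import time
import requests

def get_with_backoff(url, headers=None, max_retries=5):
    """GET a URL, honoring HTTP 429 responses by waiting before retrying."""
    for attempt in range(max_retries):
        resp = requests.get(url, headers=headers, timeout=60)
        if resp.status_code != 429:
            resp.raise_for_status()
            return resp.json()
        # Respect the server's Retry-After hint if present, else back off exponentially.
        wait = int(resp.headers.get("Retry-After", 2 ** attempt))
        time.sleep(wait)
    raise RuntimeError(f"Rate limited after {max_retries} retries: {url}")
```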
Are you available in the SAP App Store?
Jeff Carr (Precog)
We are available in the SAP Store, yes, available to purchase. But again, you can’t use Precog through SAP directly. You have to use it through our platform, which is precog.cloud or eu.precog.cloud.
Do you have customers outside of North America and Europe?
Jeff Carr (Precog)
Oh, absolutely. We have customers again in almost every part of the world. We have customers specifically in Latin America, including South America and Central America. We have customers in Australia, APAC, certainly all over Europe, certainly all over the US. Yeah, I think we’re in every continent that I can think of. I’m thinking quickly. I’m not sure we have anybody in Africa today, but everywhere else for sure.
For non-SAP apps that have some potential quirks (for example, NetSuite doesn’t do a great job of computing deletes), how do you handle these app-specific nuances?
Jeff Carr (Precog)
So we have, and this is coming directly from our customers, the best NetSuite integration on the planet. It does things that no one else can do. As the questioner here points out, all APIs are different; they vary in terms of good, bad, and difficult, and we’ve been able to work around many of these limitations with our advanced technology. Specific to the questioner’s point about NetSuite, we handle the problem you just mentioned completely. We have an outstanding NetSuite integration, and I encourage you to try it. So, yes, we’ve been able to work around a great number of the challenges of APIs. That is why we built the technology.
What’s the best way to engage: through an SAP rep or directly with Precog?
Jeff Carr (Precog)
You can certainly reach out directly to Precog, using the contacts I’ve got showing on the screen again. You can reach out to any of the people on the Precog team listed here. You’re also welcome to reach out to your SAP account manager if that’s your preferred method, and they will connect with the Precog team and schedule a use case review or whatever is required.
Can you pull data from SuccessFactors LMS?
Jeff Carr (Precog)
Yes we can.
How does Precog differ from other competitors in the market?
Jeff Carr (Precog)
So we support a much broader suite of applications. There are no other companies in our market, the ETL/ELT market, that support the entire breadth of the SAP universe: things like Ariba and Emarsys and Business ByDesign and all these things. But we also support a much broader suite of non-SAP applications. Our competitors typically support two or three hundred different cloud applications; we support over 2,000. And the other thing to recall is that if there’s one we don’t support, we will add it, again, within a matter of hours to days. Our competition will try to, and it might take weeks or months, and they may never achieve it, because they’re hand-coding these integrations, and that can be extremely slow and brittle. We don’t hand-code them; our AI literally creates the connection intelligently, without human intervention.
Outro
Jeff Carr (Precog)
I’d like to thank everybody for attending today. We hope it was helpful. Again, if you’d like to learn more about the solution or how you can work with it in your enterprise or your business, reach out to any of the people listed on the screen, or reach out to your SAP account manager and have them connect with the Precog team. Thanks again for attending; we look forward to working with you. We’ll now complete the webinar.