

Data, Data Everywhere

Dennis Carlson of Molecule speaks on a panel at Commodity Trading Week about data management in the energy and commodities trading space, and the role of multi-tenant cloud computing.

July 12th, 2022 | 57:07

Summary Keywords: data, trade, systems, cloud, organizations, business, market, capture, commodity, analytics, traders, insights, challenge, infrastructure, opportunities, lifecycle, customers, vendor, model, reporting

Transcript

Howard Walper, Commodities People

Cetin Karakus, BP

Scott Feldman, Voxsmart

Dennis Carlson, Molecule

00:00

Howard Walper All right. Before we get started, I'm gonna tell the joke I told the room before you – you're very lucky. So, a woman's kayaking off the coast of Maine, and she disappears. Everyone expects the worst. And, a few days later, the worst is indeed confirmed. The Harbormaster shows up at her house. Her husband's there. He goes, "I got some bad news for you. But, I got some good news, and I got some great news." The husband goes, "What could possibly... Well, tell me, what is this?"

"Well, unfortunately, we found your wife, and she didn't make it." Husband's devastated, of course. But the Harbormaster goes, "But, when we pulled her up, she had 12 of the best lobster we'd seen in years clinging to her. And, as the husband, you're entitled to half the catch. And the husband's horrified, of course. He goes, "what could possibly be the great news?" And, the Harbormaster goes, "We're pulling her up again tomorrow."

Alright, it's late in the afternoon, guys. You know, we want to make sure that we start everyone with a little bit of energy in this final session before margaritas and armadillos. So, it's a pleasure to welcome this session, "Data, Data Everywhere." If anyone knows where that line came from, it's actually from an old poem, "The Rime of the Ancient Mariner." The line was, "Water, water everywhere and not a drop to drink." And, years ago, I had a data analyst on my team who used to say, "Data, data everywhere and not a drop to think."

And, you know, as data has become more and more important, how we think about data is also more and more important. And, these guys have all been thinking about data quite a bit. So, let's kick in with some of the questions to start off. And, any of you, feel free to... Well, first of all, let me introduce the panel. Why don't you guys introduce yourselves? Let us know who you are and what you do.

02:02

Cetin Karakus Hello, everyone. I'm Cetin Karakus. I run Quantitative and Analytical Solutions at BP. My background is in derivatives technology, building quantitative and fundamental systems for derivatives trading in energy markets.

02:27

Scott Feldman Scott Feldman from Voxsmart. Long history in the post-trade processing infrastructure space; currently at Voxsmart, working on surveillance products, data aggregation, and data insights, and serving as COO of North America.

02:47

Dennis Carlson I'm Dennis Carlson. I'm VP of Security and Operations for Molecule Software, but I have 20 years of experience in data management and analytics supporting the commodity space. I've built reporting and analytics solutions, kind of, front to back throughout the commodity lifecycle.

03:07

Howard Walper Great. Well, thank you, folks. And, we start off with the first question just about general trends. What are some of the big trends we're seeing in data and digitalization?

Let's start with you.

03:18

Cetin Karakus Well, I mean, as you said, data is kind of increasing, because there are so many different ways of capturing data now with digitalization. You have, like, sensor data; you have other, you know, means which were unavailable in the past. Now, you have all these streams of data coming in at the same time. Storage prices are also going down with cloud, scalable infrastructure. So, there's no problem with actually having the raw data. The real issue is making sense of it in a, kind of, more, I would say, integrated sense.

Because, you know, like, the holy grail of systems would be that you have one system. Everything is there; everything is kind of integrated with each other. So, you kind of, you know, access the system to draw whatever insights you want. But in reality, you know, this couldn't be further from the truth. You will always have some sort of heterogeneous infrastructure, different systems, because of the business, like evolutions of the business, or how things are being done.

This kind of distributed, heterogeneous infrastructure is kind of a reality. And then, that poses big challenges in terms of, you know, having this kind of whole, coherent, integrated view. And, I think that's the main challenge as we try to build systems. You're trying to build systems where everything is, at the same time, constantly changing and evolving at an, you know, increasingly higher pace. I think that's the main challenge.

05:07

Scott Feldman I mean, we're seeing similar challenges, as well. What we're seeing is the number of sources of data continues to expand. And, that brings challenges around standardization and normalization.

Those are actually easier challenges to solve than what we're seeing on some of the platforms we've started to work with around distilling down that information; you're just drowning in data, right? You're aggregating as much as you can, and the data sources keep expanding. It's cheaper to store the data, so you can keep it around for longer, but it's about distilling it down... mining for gold, right?

How do we pull out of all that data the key insights that are important to deliver on? Those are some of the challenges that we see with our clients. It could be trading and research and market data; it could be annotating sales calls and storing data in a CRM system.

Gaining insights, and providing tools for clients to gain insights into that data, is kind of the key space we're trying to operate in right now.

06:10

Howard Walper Anything you want to add to that? Or?

06:13

Dennis Carlson Yeah, not to repeat a theme. I think what these two guys have said is absolutely right. There are persistent challenges in how to turn your data into valuable information. And, it's becoming even more challenging in a complex data environment, as more data points are available to us.

I think one of the opportunities we have in the technology landscape is, you're starting to see better technology on the integration side. It's more accessible; it's more affordable. And, you're starting to see a push towards the business, where tools are available to solve the semantic layer challenge. And, we can enable business analysts, front office, back office, mid office, to really work with the data, versus trying to bring that data together, figure out what it means, and put sense to it.

07:19

Howard Walper Great. Well, thank you.

You know, Scott, you mentioned trading data. And, I think that's what... I think a lot of the people at this conference are really interested in, in particular. How is data currently being used in the trade lifecycle? I guess, where are the gaps or the breaks in the process that you guys think need to be addressed?

07:48

Scott Feldman I can take that one, absolutely.

So, one of the things we're seeing is the, kind of, explosion in trade data and communications around that trade data. And, how do we link these two together? The problem is you're always getting the trade data from a second source somewhere. You know, if it's from the lifecycle of a trade, it's technically stale, right, because more activity usually happens.

So, creating that beautiful record – and I don't want to get into distributed ledger and blockchain or whatever technology... That persistent data that is stored and continually enriched and validated, reducing the number of reconciliations that customers need to do with that data, is a key trend. It's definitely being worked on.

One of the interesting things we're doing with trade data right now is forensic reconstruction of transactions and the history of that transaction. So, being able to take a trade that's been executed and done, and potentially even cleared, at the request of the regulator.

To take a forensic look at the history of that transaction, or the lifecycle events that have happened – trade events, trade surveillance events, communications, whether they're chats or calls – and then link them all back together in a programmatic way, gives you that full auditable view of that transaction.
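To make the idea concrete, here is a minimal, illustrative sketch of that kind of programmatic linking: lifecycle events, surveillance hits, and communications records merged into one time-ordered view per trade. The event model and field names are assumptions for illustration, not Voxsmart's actual data model.

```python
# Sketch: forensic trade reconstruction as a merge-and-sort over event feeds.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Event:
    ts: datetime
    trade_id: str
    source: str   # e.g. "lifecycle", "surveillance", "comms"
    detail: str

def reconstruct(trade_id, *event_feeds):
    """Return the full, time-ordered, auditable timeline for one trade."""
    merged = [e for feed in event_feeds for e in feed if e.trade_id == trade_id]
    return sorted(merged, key=lambda e: e.ts)

lifecycle = [Event(datetime(2022, 7, 1, 9, 30), "T42", "lifecycle", "executed"),
             Event(datetime(2022, 7, 2, 10, 0), "T42", "lifecycle", "cleared")]
comms = [Event(datetime(2022, 7, 1, 9, 12), "T42", "comms", "IM: price agreed")]

for e in reconstruct("T42", lifecycle, comms):
    print(e.ts.isoformat(), e.source, e.detail)
```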

09:14

Howard Walper Any other comments to add to that, or?

09:17

Cetin Karakus I think, when I look at it from a systems point of view, we have a lot of infrastructure around trade capture and, you know, the lifecycle – like, whether it's an option exercise, or if there are dividends. You know, there are lots of systems.

But, there is a kind of a gap pre-trade: like, how did you actually decide to make the trade? That's kind of linked to the pre-trade analytics, and also all the supporting systems. For example, if you're in physical commodity markets, energy markets, there's a lot of information you use to actually make the trade decisions. And, that comes from various systems. So, you can't really... I mean, there aren't really generic solutions for that, because it's very much specific to the different businesses. And also, it's very valuable and proprietary. So, like, you know, no company will want to share that. But, most of the information is actually there.

So, once the trade is captured, we have, you know, lots of systems where, you know, we keep track of it. I mean, the mere act of capturing a deal has, you know, turned into a kind of more standard form. And, most of the transactions these days are happening on the exchanges anyway. So, there's a lot of information there. But, the pre-trade part is where most of the gaps are.

10:41

Scott Feldman Quick question, too, while we're on the panel. Is it around the communication that leads up to the trade, with analytics and research? Because what we're seeing, what some of our customers and clients are asking for, is to be able to link that transaction back to all the chats and phone calls and emails. And so, that's something we're looking to solve, a crack in progress, you know, across multiple desks at some of the tier one organizations.

11:09

Cetin Karakus Yeah, I mean, but that's basically how the business works, right? So, for how you decided on that trade – we create a lot of pre-trade analytics tools, which traders use to gauge the market, you know, to understand, like, what strategy might work. There are lots of tools, obviously. They communicate with each other. They are in communication with the brokers and other market participants.

So, capturing all that in one system would be, like, the holy grail. If you do that... I think that's like an end result of any analytics system. So, the better you do that, the more successful you will be, because you will make really good trading decisions. And, that's kind of a, you know, constant goal for us: how we can, you know, make it more and more coherent. I mean, even successful trading firms, I think, don't have one system. It's just, like, very hard to capture all of that. But, you know, it's a noble goal to try to do that.

12:05

Audience Member Can I ask a question?

12:06

Howard Walper Certainly.

12:08

Audience Member How do you capture the traders' chat...?

12:15

Howard Walper So, in voice, or in typed chats?

12:18

Audience Member I know a traders' chat is very general.

12:21

Howard Walper I know Scott would want to weigh in on that one, right?

12:24

Scott Feldman Yeah. So, that's a whole business. That's a whole business, right?

So, we'll be talking about the standard chat platforms, like Bloomberg, ICE Chat. Anything that feeds from them, from ICE or Bloomberg, you know, with the permission of the firm themselves, who wants to analyze that data – either from a surveillance perspective, from a screening perspective, or for missed opportunities, coverage matrices, product mentions.

So, largely it's the quality of the engines that allows for the normalization and standardization of that data as it comes in, all that unstructured chat data. And all that unstructured... the holy grail of unstructured data is voice, right? So, we take voice data, we transcribe it. We run it through an NLP engine that identifies intents, named entity recognition, sentiment, and polarity. And, we can feed that data on to another platform – a Tableau dashboard... Any of those types of functions are the add-ons to what we do with that data.

What we like to do is enable the normalization process to happen for all the industries using the same modular engines, geared towards commodities trading, fixed-income trading, that specific model.
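As a rough illustration of the pipeline Scott outlines (transcribe, then extract entities and sentiment, then feed a dashboard), here is a minimal sketch. Every component is a toy stand-in: a real system would plug in a speech-to-text engine and trained NLP models rather than the lookup tables used here.

```python
# Sketch: voice -> transcript -> entities + sentiment -> downstream record.
from dataclasses import dataclass

@dataclass
class Insight:
    call_id: str
    transcript: str
    entities: list    # e.g. [("WTI", "COMMODITY")]
    sentiment: float  # -1.0 (negative) .. +1.0 (positive)

def transcribe(audio: bytes) -> str:
    # Stand-in for a real speech-to-text engine.
    return "We confirm the WTI cargo, no dispute on price."

def extract_entities(text: str) -> list:
    # Stand-in for a trained NER model: a tiny lookup table.
    known = {"WTI": "COMMODITY", "Brent": "COMMODITY", "ICE": "VENUE"}
    words = [w.strip(",.") for w in text.split()]
    return [(w, known[w]) for w in words if w in known]

def score_sentiment(text: str) -> float:
    # Toy lexicon-based polarity; real systems use trained classifiers.
    pos, neg = {"confirm", "agree", "buy"}, {"dispute", "cancel", "problem"}
    words = [w.strip(",.").lower() for w in text.split()]
    return (sum(w in pos for w in words) - sum(w in neg for w in words)) / max(len(words), 1)

def process_call(call_id: str, audio: bytes) -> Insight:
    text = transcribe(audio)
    return Insight(call_id, text, extract_entities(text), score_sentiment(text))

print(process_call("call-001", b""))  # a record a dashboard could consume
```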

13:45

Howard Walper Did you want to... did you have any thoughts on how BP looks at some of this information?

13:50

Cetin Karakus I mean, I can't get into too much detail, but we use some third-party products. The voice capture... I mean, capturing the text, the chat, is the easiest thing, right? So, literally, you put some, you know, interceptor layer. Whatever you type, you know, goes into the chat distribution engine and just gets logged. So, that's kind of the easiest thing.

The voice is a bit more challenging because, I mean, it requires a lot more data to capture it. But, you know, you have systems to do that. And, the main challenge is obviously to kind of examine those records, because it's kind of a human-intensive process. So, someone has to actually listen to tapes.

But, as Scott mentioned, now we have kind of new systems where real-time, NLP-based voice-to-text conversion is taking place. So, once you've converted the voice into, kind of, a textual transcript, you can use various tools. Like, you could do, kind of, a keyword search, or, you know, other text-based processing. So, that makes it much easier. And, I think the goal is almost real-time monitoring. And obviously, when there's an issue – there's a trade dispute, or there's some sort of a compliance, you know, let's say, complaint or whatever being raised – you can go and check the tapes. You can find out after the fact.

But, the idea is whether you can do kind of real-time monitoring anyway. You notice something, and you can just immediately notify someone from compliance – or maybe a trading manager – to go and take an action. So, that's, I think, where we're kind of going with the developments in the technology.
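A minimal sketch of that real-time monitoring idea, assuming a stream of already-transcribed utterances: scan each one against a compliance watchlist and notify someone immediately on a hit. The watchlist phrases and the alert routing are invented for illustration.

```python
# Sketch: real-time keyword surveillance over a voice-to-text stream.
import re

WATCHLIST = re.compile(r"\b(fix the price|off the record|keep this quiet)\b", re.I)

def monitor(transcript_stream, notify):
    """transcript_stream yields (speaker, utterance) pairs as calls are transcribed."""
    for speaker, utterance in transcript_stream:
        match = WATCHLIST.search(utterance)
        if match:
            notify(f"ALERT: {speaker} said '{match.group(0)}'")

# A canned feed standing in for a live transcription stream.
feed = [("trader_1", "Let's keep this off the record."),
        ("trader_2", "Booking ten lots at market.")]
monitor(feed, notify=print)  # -> ALERT: trader_1 said 'off the record'
```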

15:35

Dennis Carlson Yeah, that's a good point. I would add that, as the cloud kind of evolves and becomes more accessible and more secure, you're finding more and more sophisticated offerings in that space, in general. And, I think they're becoming more accessible as we go. Scott doesn't necessarily want me to say this, I guess.

But, there's an interesting evolution of natural language processing going on that I think is going to be very compelling for the commodity space. And, I agree with Cetin, right: getting to that point where you can actually capture a trade, as it's articulated, is particularly compelling. And, one of the big drivers is, again, the regulatory demand. And, as quickly as you can, identifying, and even checking, that the trade that was actually executed is what was agreed.

16:33

Howard Walper Great. You know, one of the earliest things we said in this panel was about adding value, right? It's one thing to say, "Hey, we collect information maybe because, you know, we have to, maybe because it might protect us, or maybe because it's just good to do."

But, what are some of the challenges in bringing data together across a trading organization in a way that brings substantial value?

17:05

Cetin Karakus I guess, when you think about it, value is like revenue minus costs, right? So, either you have to increase the revenues, or profits, or you have to reduce the costs. And then, when you delve into it, there are, kind of, various types of things. Like, I guess, as a trading organization, you're mainly focused on, obviously, the revenue side.

But, you know, in a big organization, you have risk, operations, some other functions which have to support the trading. They mainly focus on the cost side. When I say cost, that also includes, obviously, compliance, regulatory, legal exposure. That's also a cost, right, if things go wrong. So, they kind of focus on that. On the revenue generation side, the idea is, like, you know, how can you put on profitable trades in a kind of consistent, high-volume way? And, the data has a lot of value there.

Without data, we would be, like, flying blind, right? You need all these gauges, all the data coming in, so you can make your assessment. How you gauge the market, and how you, you know, put on new strategies, and how you kind of change your hedge positions – all that depends on data.

And, it's very important, especially in the commodity markets, which are very physically driven, right? It's, like, based ultimately on supply-demand factors. And, we have, like, you know, tens of thousands of different commodities, and hundreds of thousands of different markets – like, different specific locations. So, if you do this well, then you could literally turn all the discrepancies between different locations into a very profitable physical arbitrage business. So, you don't really need to know exactly what's going to happen in a specific commodity, as long as you know how the prices compare from one location to another.

It becomes like a business, really. You just take something, you know, store it, transport it, and then sell it at another location, and do this, you know, day after day. So, that's only possible if you have the systems, infrastructure, and data to allow you to do that.
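The arithmetic behind that physical arbitrage is simple; what's hard is having the data to see it. A toy illustration with made-up numbers:

```python
# Sketch: a location spread is only worth trading if it beats the cost
# of moving and storing the commodity. All numbers are invented.
def arb_pnl(buy_px, sell_px, transport_cost, storage_cost, volume):
    """P&L of buying at one location and selling at another."""
    spread = sell_px - buy_px
    return (spread - transport_cost - storage_cost) * volume

# e.g. gas at hub A vs. hub B, in $/MMBtu, on a 10,000 MMBtu cargo
pnl = arb_pnl(buy_px=2.80, sell_px=3.45, transport_cost=0.40,
              storage_cost=0.10, volume=10_000)
print(f"expected P&L: ${pnl:,.0f}")  # expected P&L: $1,500
```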

19:21

Dennis Carlson Yeah, I absolutely agree with that. I think what we're seeing is businesses' demand for commodity data, in particular upstream. And, we've touched on this a little bit, right? Being able to model markets is compelling when it comes to identifying immediate and future opportunities. And, as we solve the problem of conformance of data – which exists everywhere, mind you, and it's not a trivial problem – if you're able to solve that, even within your own business, you're then able to run those analytics that tell you: are your traders actually performing in the way you expect them to perform? Have you missed opportunities? Have you met regulatory requirements? And, you can do that dynamically.

One interesting conversation that's out there is, with the rise and democratization of artificial intelligence, I think over the next couple of years, you're going to see that move from more of an academic pursuit to a value-added pursuit. And, you can even start identifying and suggesting opportunities, either for trading or for scheduling, by ingesting that volume of data and applying certain metrics and analytics against it that can actually prompt your traders or your schedulers to opportunities. And, I don't think we ever take the human out of it, right? These people are professionals for a reason, and you all have experience. But, I do think, as humans, we miss opportunities that systems can help identify throughout the business stack.

21:04

Scott Feldman There are certain keywords that we hear frequently, like missed opportunities... All of a sudden, whatever, that trade wasn't working, or what someone was looking at got, kind of, put off to the side.

I think one thing that's important to understand is the timeliness of data is paramount to extrapolating the most value out of these products, right? So, there's an 800-pound gorilla in the chat space, and they don't give you the data until the end of the day, right? And so, the timeliness of that data is very difficult; it's basically history. Actually, I don't get to see it until the next morning, right? Things like that.

So, voice... even access to voice chat, to voice conversation, in real time. We've got the ability, like a lot of services, I believe, to do real-time transcription and provide the services on top of it, and you get really actionable items that come out of this – the best opportunities presented to your desktop in the minutes after it happens, right? So, that's a visual cue: I've still got a trade... I've still got a confirmation I'm looking for. That's the opportunity we're trying to present.

22:17

Howard Walper Wonderful.

By the way, anybody who feels like jumping in with a question, we're a nice intimate group here. Here we go, we got one back here.

22:26

Audience Member Your comment about pre-trade, that actually really resonated with me. Like, every time someone on your desk makes a decision... I'm not necessarily... of course, I'm interested in whether we generate the PnL.

But, I'm more interested in, why did you make that trade? Like, what was the exposure, the income analysis, or the fundamental views? Or, like, did you have some intuition, or did you think there was something you could monetize? I do that personally, right?

I try to, like, whenever I can, track that: "Okay, this is the reason why we made that trade." But, do you do that systematically at BP? Like, trying to understand why...

23:04

Cetin Karakus Wow, I can't really comment on that. But, I don't think so. I mean, I don't know exactly what all our traders are doing. But, I'm not aware of any system which captures that. And also, there's a bit of, I guess, what you see is traders being reluctant to share that information. But, I think, institutionally, it would be a very useful thing to do, actually, if you could somehow enforce that.

There are certain things, obviously... based on the kind of strategies they are using, you could somehow understand why certain trades are being put in place. For example, if you're hedging. You know, if you have an option book, you might be, like, doing delta hedging, right? So, the reason why you put on those trades is pretty clear.

But, for some other things... I think the only thing you can usually find is, like: what is this deal? Which book is it in? Which strategy does it belong to? From there, you can infer something, but I don't think, you know, it's captured in a more systematic way.

24:16

Audience Member ...I think being compliant – like, you know, having all the voice records, for example. You guys mentioned it. Were you talking about that? Or, what other things?

24:27

Cetin Karakus Well, I was talking more commercial, obviously. Like, compliance is very important. So, like, for example, before the trades, you might have some chat communication with the brokers and other market participants. It's very important, the sort of language you're using, right? So, we have that. That's the part which is very much heavily monitored.

So, you know, there's market regulation; you can't be in collusion with other participants, for example, to, you know, push the prices or, you know, like, manipulate the markets. So, that's very much heavily monitored. But, I was talking more about the commercial side – what the commercial directives were, why you're taking these positions, etc.

That's, I think, a complex process. The problem, I think, is with capturing that... it's a very complex process, right? You might be getting some information from your systems. You talk with your analysts. They have some views around how the markets are moving. And, you look at the market action, the market data, what's happening in the market, and then you have your own, maybe, intuition. And then, you put it through.

So, capturing all this in a, kind of, more digestible form... I think it's going to be difficult. But, you know, whatever you could do in that sense would be quite useful... to do, kind of, you know, postmortems and analysis.

25:55

Scott Feldman I think I can take it from here. So, this is essentially more auditable steps, right? And, I don't want to make this too much of a pitch. But, the trade reconstruction tool we have creates a timeline view – all the communication transactions in the post-trade and all the pre-trade, right?

So, if you were to overlay, on top of our pre-trade, trade, post-trade timeline, some of the analytics that your traders were using, right – so maybe new data came out, right, and then, all of a sudden, you see a spike in a particular commodity or particular asset class, particular fixed income, or whatever it may be.

It interpolates that on top of the trade lifecycle. It's now mapped out for you to kind of understand how all your traders are interpreting the data, your research department's changes. And, when the Fed comes out and speaks about something and makes a big rate change, whatever it may be, from that point on, you can overlay that kind of transaction history and communications to identify the correlations...

27:01

Howard Walper There's a question back there.

27:05

Audience Member So, we are seeing data, data everywhere. But, it's really hard to reconstruct from start to end and get insights and find opportunities or missed opportunities. So, what are the pieces that we are missing? ... What is preventing us from getting there?

27:30

Cetin Karakus I mean, yeah, good question.

I mean, the thing is, with data, you could capture as much as you want, right? But, for example, if you look at our bodies, we could capture trillions of data points. But, just having that data is certainly not going to make you understand how all the metabolic processes work, or how a medicine works, right? That requires its own research. I think the problem is... we are capturing data, but just capturing it is not going to give you the insights.

So, you need to have a kind of more fundamental approach. Like, you need to understand your business. You need to have a way to model the data. I mean, my personal opinion is, integration is a key thing, because the data is going to be coming from the different operational business systems, and that's reality. So, you can't just say, like, "Oh yeah, I'm just going to deal with one system."

I mean, Scott, for example, was talking about his products. We have, like, other vendors coming with different systems. Great product, but there will be another product; there'll be another system. Data will always be coming from different parts of the organization. Being able to interrelate that data, that's where you get a big picture of what is happening. And, that's what gives you insight. And, that's only possible if you do it in a kind of more bottom-up approach, where, for example, you have some sort of enterprise data dictionary where you map out the concepts which drive your business, right?

And then, you link that to the various systems, instead of doing it the other way around – like, "I have this system and this system; how can I, you know, integrate them?" That becomes more like patchwork. You just may be flowing data from one side to the other, but you don't really get it fundamentally. You might generate a lot of reports and visualizations, but you don't really get a lot of insight into what is happening. But, if you have a more, kind of, bottom-up, ground-level approach, where you go from the principles and concepts and then say, "Okay, I have this system; there might be data coming from this system; I can map it to that" – that's not an easy solution, right?

There's no quick fix for that. You can't just buy a vendor's product or put some, you know, servers on AWS, and suddenly... You need to, like, you know, design this thing and get it completely adopted by your organization. And then, have everyone contribute to that, you know, core system, because it's going to evolve. So, it's not going to be, like, you know, done.

It's not like, you know, "We're going to use this for the rest of our lives." It's something that constantly evolves. So, none of these are easy things. It requires commitment and investment. I think that's why, you know, it doesn't happen very quickly.
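One way to picture the enterprise data dictionary Cetin describes: define the business concepts once, and have each source system declare how its fields map onto them. This sketch is purely illustrative; the concept and system names are invented.

```python
# Sketch: canonical concepts first, per-system field mappings second.
CONCEPTS = {"counterparty", "commodity", "quantity", "trade_date"}

SYSTEM_MAPPINGS = {
    "etrm": {"cpty_name": "counterparty", "cmdty": "commodity",
             "qty": "quantity", "tdate": "trade_date"},
    "crm":  {"account": "counterparty"},
}

def normalize(system, record):
    """Translate one system's record into the shared vocabulary."""
    mapping = SYSTEM_MAPPINGS[system]
    out = {mapping[k]: v for k, v in record.items() if k in mapping}
    unknown = set(out) - CONCEPTS
    if unknown:
        raise ValueError(f"unmapped concepts: {unknown}")
    return out

print(normalize("etrm", {"cpty_name": "Acme Power", "qty": 50}))
# {'counterparty': 'Acme Power', 'quantity': 50}
```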

30:23

Audience Member [inaudible]

30:28

Dennis Carlson Yes, the money and tears that have been shed over data management. I think, in reality, it depends on the organization that you're dealing with. But, it also depends on how the business is driving priority and what's important, right? And, the technology is allowing us to get faster and more affordable in bringing data together and conforming it in a way that's really valuable, right?

Businesses are starting to realize their data is actually an asset. And, organizations that think of their data as an asset have a lot of opportunity, because then it becomes something you want to invest in, right? The one biggest challenge in recent history has been just the cost of standing up servers, bringing in data, doing conformance, conforming master data across an organization, and so on. Again, the technologies that have evolved in the last two, three years have really helped compress that scale and cost and can make things more agile, right?

So, the data lake is a concept that, I think, in the data world, we'd say is maybe less than successful in and of itself. But, the idea of bringing your data together quickly, analyzing what's available, and then starting to drive analytics and reporting from the business perspective is essential, in my mind. And, this is what we're hearing from our customers, right? This conversation is a very similar theme across most of the organizations that Molecule works with. They want all of their data, not just their ETRM data. They want market data. They want middle-office data. They want back-office data. But, they don't necessarily want to go out there and rebuild their entire infrastructure, right?

So, that creates those opportunities to bring that data together to extract that value. So, I think, again, the classic challenge of this is: how do you put context to your data, right? You can talk to two different trading organizations, and they use different words for different things. So, there's even a lack of consistency across the vertical that we all live in. But, solving that problem within an organization can also be very challenging. So, contextualizing your data is key to understanding what you have. It's key to relating it across your systems and across your lifecycle. And, once you have that understood, you can then start doing all kinds of interesting things with your data. You can start doing the things that analysts want to do, right: extracting value, building reports, running analytics, running models on top of it.

I think it is still a challenge that organizations try to get to that result before their data is sufficiently contextualized and conformed – and thus, the challenge of having this massive volume of data.

33:54

Cetin Karakus I will add that I think there is a gap in terms of having technology to enable, kind of, semantic integration and a dynamic, you know, lifecycle for those semantic concepts to evolve. There's definitely... I think there's a gap in terms of technology. Like, we have technology to store data. You have, you know, general reporting and visualization from different systems, integrating the various data from different places.

There's kind of a technological gap. But, there's also a big part of being able to do this organizationally. So, assuming that you have such a system, you need to open up the data to the whole organization, because data touches everyone. Like, you know, everyone in the organization will be contributing to the data. So, you need to have something like a Wikipedia type of model, where everyone can contribute to it and it kind of evolves. So, if you do that, then you're going to have a really kind of holistic view of what's happening, and then...

35:00

Scott Feldman Yeah, I mean, the only thing I'll add is, as vendors in this space, the good corporate citizens should be interoperable. And, we should be able to interoperate with each other in a seamless way that decreases the challenge that BP and most clients have, right? Clearly, APIs and an ecosystem that allows for this to happen are really beneficial to the vendor community, as well as the client.

35:38

Howard Walper Are there any other questions from the floor before we move on?

Alright, earlier, you mentioned capturing things for compliance, right? What are some of the regulatory requirements right now that are driving some of the data storage and/or organizational frameworks for data?

36:03

Cetin Karakus I'm not that familiar with the specifics, because I'm more focused on the front office. But, I know that there are requirements about position reporting after Dodd-Frank, you know. There are more requirements around your exposure and your positions. So, there's probably some drive over there. And, what else... I mean, we deal with customers, etc., as part of the onboarding processes. You know, we don't deal with just anybody; the customer first has to be kind of validated and kind of onboarded. So, there's quite a bit of data capture on that side.

And also, there's kind of a difference between the exchange-traded deal flows and the OTC side. The exchange part is pretty much safer, I guess, because, you know, you rely on the exchange; it's kind of a big intermediary. And, OTC relies more on, essentially, credit risk and, you know, kind of having that customer knowledge. So, there's probably, kind of, reporting being done on that side.

37:24

Scott Feldman There's definitely more on regulatory compliance that relates to... So, the things you said are absolutely accurate, right? It's combining things together. What we are seeing is the desire to apply the same set of workflows and monitoring across all channels, right? It's a very complicated process – where traders are allowed to communicate, and how they're allowed to communicate. They're going to go wherever they're compliant to communicate.

So, providing a compliant solution and compliant communication channels, aggregated in a safe spot, is an important thing. Whether it's a message on a forum or a message on WhatsApp, or a call from their mobile phone, you're at the same level of compliance, regardless of the channel.

38:21

Howard Walper Wonderful. Let's talk about the cloud. What are some of the advantages and features of cloud offerings? And, how do they stack up against on-premise storage?

38:37

Cetin Karakus Yeah, I mean, what the cloud provides is: we used to have our big data centers, where we had to manage the servers, server farms, infrastructure. Instead of doing that, now we're using AWS or Azure and the likes of such offerings. So, you don't have to manage the physical infrastructure. And obviously, you don't have to, necessarily, employ that many people to manage that infrastructure.

And, on top of that, it provides you some, you know, compute and storage elasticity, or flexibility, so you can scale up. Whereas in the past, if you managed your own data center, then, you know, it becomes like logistical planning. If you had to add new servers, you had to order that hardware, and, you know, the ETA for that might be like six months. So, you wouldn't be able to scale up before those six months, you know, when those servers arrive. By outsourcing that to the cloud vendors, you have that kind of flexibility.

But also, I think, on the financial side, it essentially turns into a bit of financial engineering. It turns a capital investment into more of an ongoing cost. So, I don't know, in the short term, it might be more attractive for certain projects. But long term, obviously, that might have some implications. And then, the whole modern software stack, I think, is kind of evolving towards the cloud now, as well.

So, whatever you're getting, all solutions are based on cloud. So, even if you don't want to use a cloud-based solution, you'll probably end up creating, like, an internal cloud, because the technology stacks are now very much tuned for the cloud. I think, overall, it's kind of a positive development.

40:34

Dennis Carlson Yeah, to add to that a little bit, I think one of the things the cloud is providing for organizations of various sizes is a relatively high level of data security. And, it's a progressive concern; in the trading and commodity business, you want to make sure your data is secure. The people who trade with you want to make sure your data is secure. And, the cloud allows access to security features that are cutting edge, that would be extremely expensive, in some cases, and prohibitive to do on-prem.

The other thing that I think is great about the cloud, when it comes to the commodity business, is it allows you to be very agile. You are able to spin up infrastructure relatively quickly. You can test platforms. You can test models. You can run analytics against your data as an experiment. And, it's relatively low cost, low risk. And, if it's not what you're looking for, you can shut that down and move on.

And, in an environment of volatility and an environment of changing marketplaces for any number of reasons, models need to change rapidly. Organizations want to test their positions. They want to test margin possibilities and outcomes. And, the cloud allows you to do that at a pace that we've never had before on-prem. So, I think the flexibility that the cloud brings is going to be progressively more valuable to the commodity trading space and organizations there.

42:17

Scott Feldman The auto-scaling ability allows you to save money when you're under low volume, and handle capacity, you know, almost without a ceiling. And, there's nothing worse than having a bunch of servers that aren't doing anything that you've already paid for, right? So, the cloud gives you an opportunity to kind of pay as you go, with limited overhead.

And, certainly, the flexibility to be able to deploy – creating a cloud-native solution that is also deployable on-premise is something that all kinds of companies are dealing with, right? Because everyone's got their preferred flavor of cloud: GCP, or AWS, or no cloud at all. So, you have a solution that's available and deployable using Docker containers.

43:15

Howard Walper Sorry, I have an alarm clock that goes off every day, just to make sure I'm still, you know, alert. Sorry about that.

So, we have time for one more before – any questions before... There's one question back there.

43:30

Audience Member That was a good point. I think, previously, like, you would bring the data to the analysis. But now, I feel like you do the analysis where the data resides. So, you push the model towards where the data is. But, I sense a little bit of friction, right? People are either uncomfortable, or they think it is very... Like, how can you make that transition more seamless for your customers?

43:56

Dennis Carlson So, I don't think there's a single answer to that. I do think that the cloud offers flexibility and scale to bring data together. Now, we live in a space where there's always proprietary data, right? I see that breaking down somewhat, because companies and entities and markets are starting to realize that data has value to the market and to their customers, beyond just their own proprietary demands.

But, in some ways, that's part of the persistent problem: we don't really have a common shared language in commodity trading, right? We understand what a trade is. We understand what a commodity is. But, across organizations, and even across vendors, the way we capture that data can be very different. And, that's part of the challenge. I think the tools that are dynamic, semantic layers – and when I say that, I mean there are progressively better tools out there that are able to put the model into memory, right?

So, you no longer have to move all your data around; you can actually build your semantic model – your conformance – in a memory layer, and then do your analytics and reporting off of that. Those have become much more sophisticated over the last five years, and much more affordable, in part driven by the cloud. But, you're also seeing a number of open-source offerings that are quite robust, right?

And, to be frank, they're coming out of the big data-munging companies, like Meta or Google. Those are the foundations for those models. And then, they've been commercialized to make them accessible across the data market in general. But, I think those tools are going to see more adoption.
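As a concrete (and deliberately small) example of the in-memory semantic layer idea, here is a sketch using DuckDB, one embeddable open-source analytics engine; the tables and field names are invented. The conformed vocabulary lives as views over the raw, system-specific tables, so analytics run against the model without physically relocating the data.

```python
# Sketch: an in-memory "semantic layer" as views over raw source tables.
import duckdb

con = duckdb.connect()  # in-memory database

# Two source "systems" land data with their own local vocabularies.
con.execute("CREATE TABLE etrm_trades (cpty TEXT, cmdty TEXT, qty DOUBLE)")
con.execute("INSERT INTO etrm_trades VALUES ('Acme', 'power', 50), ('Beta', 'gas', 20)")
con.execute("CREATE TABLE mkt_prices (commodity TEXT, px DOUBLE)")
con.execute("INSERT INTO mkt_prices VALUES ('power', 85.0), ('gas', 32.5)")

# The conformed model: the business vocabulary defined as a view.
con.execute("""
    CREATE VIEW trades AS
    SELECT cpty AS counterparty, cmdty AS commodity, qty AS quantity
    FROM etrm_trades
""")

# Analytics and reporting run against the conformed view.
print(con.execute("""
    SELECT t.counterparty, t.quantity * p.px AS notional
    FROM trades t JOIN mkt_prices p USING (commodity)
    ORDER BY notional DESC
""").fetchall())
```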

I say this, in part, to then say this: I don't think any tool out there solves the conformance problem, right? That is always going to involve the business giving guidance and making decisions and telling the technical side of the house what's important and how they want to see that data.

Does that kind of answer your question?

46:35

Audience Member I mean, yes and no, but I just wanted to hear your perspective on multi-tenancy – something that many organizations have security concerns about on a day-to-day basis.

47:00

Dennis Carlson Well played.

So, Molecule itself is actually multi-tenant. And, I think there are always organizations who are going to have concerns about that. I think we're a great case in point because we've proven that commodity trading organizations are able to leverage that multi-tenancy to their advantage. And what's the advantage? The advantage is, I'm not managing multiple branches for you and your competitor and, you know, whatever customer I have out there. I have uniformity of deployment.

Therefore, the advantage that I see in multi-tenancy is speed of releasing new features and commonality of features to the broader market. And, you homed in on the challenge of multi-tenancy, right: it's around access security and data leakage. We handle that with physical and logical data models. And, you know, you've got to be rigorous. You've got to test, and we have very, very rigorous testing to ensure that we're not having any data contamination when we do a release. We make a release at Molecule, for example, at least every four weeks, sometimes more quickly.

So, we're very agile, very containerized. And, we're able to move out new features quickly. But, we always go through rigorous testing. Security is the only answer to that. And, I wish I could tell you it was fool-proof. But, I also appreciate that there are some organizations out there that will never be comfortable with multi-tenancy.
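One common pattern behind the logical data models Dennis mentions is tenant-scoped access: every read and write is forced through a session that carries the tenant's identity, so one tenant's queries can never touch another's rows. This is a generic sketch of that pattern, not Molecule's actual implementation.

```python
# Sketch: logical tenant isolation via a tenant-scoped session.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE trades (tenant_id TEXT, trade_id TEXT, qty REAL)")

class TenantSession:
    """Callers never write raw SQL; every statement is scoped to one tenant."""
    def __init__(self, con, tenant_id):
        self.con, self.tenant_id = con, tenant_id

    def add_trade(self, trade_id, qty):
        self.con.execute("INSERT INTO trades VALUES (?, ?, ?)",
                         (self.tenant_id, trade_id, qty))

    def trades(self):
        cur = self.con.execute(
            "SELECT trade_id, qty FROM trades WHERE tenant_id = ?",
            (self.tenant_id,))
        return cur.fetchall()

a, b = TenantSession(con, "firm_a"), TenantSession(con, "firm_b")
a.add_trade("T1", 10_000)
b.add_trade("T2", 5_000)
assert a.trades() == [("T1", 10_000.0)]  # firm_a never sees firm_b's rows
```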

48:53

Cetin Karakus I have something to add about the cloud. I think, when we look at the cloud, one thing it did is, like I said, it allowed... you don't have to manage the physical infrastructure. So, in that sense, it doesn't really bring anything new. You used to have data centers; it's just that now it's on Amazon's data centers. So, there's no difference.

The main difference is, with the cloud, some of the vendors actually become the operators of the systems as well, right? So, you can actually buy a service, whereas previously you had to buy the software from a vendor. You buy the hardware. You have to deploy it, have people who know how to deploy it and run it, you know, maintain that software. There might be, like, you know, regular security patches. The database might be full; you need to take backups. There's a lot of operations involved in actually using a system. So, now, what the cloud does is, you can just get all of that as a kind of service.

You just use it. You don't really care about how this thing is going to scale, or how the storage and the backups are done, or how the security is done. So, you literally just focus on what you need out of the software. I think, especially for small and medium organizations, that's a great service, because then you just basically pay the subscription fee, and then you use that service. You don't have to worry about how these things work or, you know, get all the people to manage that.

Obviously, for bigger organizations, things might be different. But even then, you know, for certain systems that are not their core systems, they prefer to use that approach.

50:40

Dennis Carlson If that's preventing you from having a further conversation with Molecule, for the right price, we're happy to do single-tenancy for you.

50:48

Howard Walper By the way, Dennis, thank you for using the word data munging. I really love that phrase. So yeah, data munging.

One final question we have time for, oh, what does the audience think? There you go.

51:00

Audience Member So, I totally agree that typical SaaS, or proper multi-tenant SaaS platforms, add the value of uniformity – the actual single branch. But, that's really more of a single-branch benefit than something the classic on-prem or single-tenant cloud path, or whatever you want to call that, is going to give you.

So, I get many different types of deployment platforms. But, that's not SaaS; that's a single-branch benefit. So, my question is for the VP. I mean, I can remember back in, like, the early 2000s, you know, some of the data warehouse projects that were going on. The amount of money that was spent on that was massive, with the large companies. What's being done right now for a small company, you know, a five-person trading shop that also needs a data solution? How do they do that?

52:17

Dennis Carlson So, I think they have a couple of options. One, they can certainly take advantage of some of the popular cloud offerings that are out there. And again, at the risk of plugging my own company, we're rolling out a feature that's a curated data fabric, for that very reason. And, that is single-tenancy that allows reporting and analytics. It's been really driven by the request and demand from our customers: they have other data, outside of our ETRM, that they want to put together in a sensible way, so that they can run more sophisticated analytics and reporting, right?

They want to bring in market data. They want to bring in mid-office data. They want to do a little more regulatory insight or reporting. And, they don't have the IT infrastructure. They don't have the breadth of staff to build that on their own. So, our offering says, "Hey, if you're on Molecule, we'll bring all your trading data, all your ETRM data, into that, so you can report on it in any way you want.

Plus, we'll allow you to bring in other data from your environment, or market data that you have licensed. And, we will curate your data model; we will push that semantic layer out. We will manage it and maintain it over time, so that you, as a smaller company, don't have to worry about everything behind your data model. Right? We'll do our very best to bring together all that data."

And obviously, it requires work with the company. But, we're doing all that heavy lifting. And, once it's there, we manage it over time, and free that company of five people, or ten people, or whatever the case is, to run the analytics. And, they can use our tools to do the analytics, or... we're entirely agnostic at that point. So, that's one solution.

I think the other solution is rapid development, and fair experimentation, in the cloud. Again, it's brought the cost down effectively. And, there are plenty of tools that you can tap into with a bit of technical know-how, right? I think the other thing that's interesting, in general, is that we are kind of seeing that democratization of tools, right? Tools that are going to demystify some of the technical back-end and really enable, kind of, the citizen developer, if you will.

54:49

Scott Feldman So, I'll leave you with, like, my one comment on this issue. Essentially, there's just no one-size-fits-all solution, right? And, as a vendor in the space, you know, we've got SaaS clients. We've got on-prem clients. We've got combinations. But, that hybrid focus... some of these newer hybrid setups are going to be there for a while, because there are parts of the data model that the clients we're dealing with do not want to disclose to the cloud. We've got to be able to deal with that data, seamlessly.

Our SaaS platform, or cloud-based solution, has to be able to access that data safely, securely, encrypted – held in memory and deleted when the client has decided. We know that a one-size-fits-all solution just doesn't exist. It's all about being flexible: solve the problems that our clients have, and not try to solve it for them, right? Not give them all one option. We should be flexible.

55:53

Howard Walper Great. Just to be fair, Cetin, do you have any last words here as well?

55:59

Cetin Karakus No, I mean, it's great to use a SaaS solution. I think the only thing I would add is, there could be, like, a vendor lock-in problem. So, I think the industry should address that by having open data standards. Otherwise, you know, you're at the mercy of one vendor – not just because of potential malice, but, you know, they might go out of business, for example. There might be some business implications which are out of your control. But, if you have, like, an open data standard, then at least you can move your data from one platform to another, because data is going to be your most valuable asset.

You put all your, you know, curation, all your, you know, kind of business operations data there, and then suddenly that vendor goes out of business, and what are you going to do? So, you need to be able to move your business to some other vendor. As long as that's the case, then just choose whatever, you know, offering best suits your needs.

56:56

Howard Walper Wonderful. Well, with that, we're gonna say thank you to Dennis, to Scott, and to Cetin. Thank you very much.

Transcribed by otter.ai