
Implementing New E/CTRM Software? How to Increase ROI + Speed Up Time to Value

Get the guidance you need from our experts to ensure your next ETRM/CTRM system implementation runs as efficiently as possible.

February 29th, 2024 | 42:37

Summary Keywords: ETRM software, ETRM implementation, ETRM/CTRM, ETRM systems, E/CTRM, training and management, change management, software implementation

Transcript

Tamasin Ford, COO at Molecule — Speaker

Alex Chandy, VP of Customer Success at Molecule — Speaker

Kari Foster, VP of Marketing at Molecule — Moderator

00:00

KARI FOSTER No matter what stage of the implementation process you're in, our presenters have some wisdom to impart today. I am very excited to introduce Tamasin Ford and Alex Chandy, our COO and our VP of Customer Success here at Molecule. Welcome to you both.

00:32

TAMASIN FORD Thanks so much, Kari. I'd like to personally thank everyone who's taking time out of their busy day to join us today. Alex and I, in our roles, are deeply involved in implementations here at Molecule. We work across a range of commodities and customer types or business models. And between the two of us, we've had decades of seeing the good, the bad, and the ugly. So we're here to share war stories, talk about our go-to approaches, and share ideas that we have about making ETRM implementations work and, specifically, making them stick. Adoption is one of the go-to words, and we're really focusing on it from a change management perspective today. That's not to say that we have all the answers. I'm sure that many of you here have been through some of these challenges yourselves, emerged stronger on the other side, and have stories of your own to share. But as we acknowledge, we learn this through experience, and hearing other people's experiences is often a very valuable way to prepare yourself and gain lenses or frameworks that you can apply. We can all agree that people naturally dislike change. So if this was a story, the name of the villain would be Resistance.

01:52

I mean, if you look at this slide, how many times have you heard someone say this? Or really, say something similar that implies this feeling: "We've always done it this way," and there's so much subtext to it. "You're disrupting things," "You're making life harder on me," "Why can't you just leave me alone to do my job? I know what I'm doing, and you might not know all the details," "Things are just fine the way they are at the moment," "I'm OK with a little bit of manual work as long as I know how it all goes," and on and on. There's a whole pile of reasons, rational and emotional, that come up that cause this resistance. And it makes sense. I mean, change is the enemy of comfort. It causes a sharp increase in cognitive load, because now as a user, you have to think. Where previously you were on autopilot, you have to learn something new. You have to do something new. It represents unfamiliar territory. Here be dragons, right? Unknown challenges, risks. And then your old tools and approaches may not work in the new environment. You may actually be losing some things, and you don't know what you gain in return yet at the beginning.

03:05

So at the very least, we're asking the users to try. They're going to have to adapt, they're going to have to do all of these things that are disruptive for them, in the hope that it's going to be beneficial in the long term for them and for the entire company. But in order for that to happen, we have to bring them along.

03:29

ALEX CHANDY And, Tamasin, I guess the other context to add to that is this is all done while they still have to do their day jobs, right? And they already have deadlines. They have processes that need to be run, dependencies that other people have on them, other departments, other workflows. And so this is an extra burden on top of everything that they already need to do, because the business doesn't stop just because you're implementing new software.

03:55

TAMASIN FORD Yeah, 100%. So Alex, do you have some frameworks or ways to structure how you work with people when they are this mess of emotions and resistance?

04:12

ALEX CHANDY Yeah, I think it all boils down to the fact that you kind of have to turn the whole aspect of change on its axis and think of it from the perspective of, what is the goal we're after here? And the goal we're after is really around, one, building your capabilities, making it easier for you to do the job, and allowing you the head space to actually focus on how you're improving the work that you do, as opposed to managing the work that you do, right? At the end of the day, one of the aspects of putting in new software, putting in new applications, is really around getting some of the drudgery out of the work and allowing the user more head space to really focus on the actual tasks that need to be done, as opposed to managing the data or managing spreadsheets or managing processes. And I kind of feel that that's the exciting part of all of this. And that's what gets lost, because we tend to focus immediately on, what are the processes and what is the technology? But really, at the end of the day, those two things are the enablers. The real outcome that we're looking at is, how do I make things better for you? How do I allow you to bring the benefit of the experience that you have doing this work, and actually be able to innovate, do it better, become a leader, and show your department and your peers better ways of doing things, because you're not mired in the muck?

05:54

TAMASIN FORD Yeah. Essentially, what I'm hearing from you there is that inspiration and vision are among the best antidotes to that fear and resistance.

06:04

ALEX CHANDY Exactly. When you think of it in that way, it's really around setting in place a strategy that puts those new capabilities front and center. That's what we're going to talk about here. We distilled it into three general categories: education, persuasion, and listening. Each one of these could probably be a webinar all by itself. We're going to distill our learnings and our experience around all of them. But these three are maybe often overlooked because they fall under change management. Your typical software deployment methodologies talk about people and process and technology. Everyone jumps onto process and technology because they're very defined. And people is this amorphous term. It doesn't really mean anything. It's like, well, you have to train people. You have to document stuff for people. But really, the goal there is, how do I enable my users to do more, or do more value-add compared to what they do today? One of the things I always fall back on is the golden rule of change. It's like the golden rule itself, do unto others as you would have them do unto you: manage the change the way you would want it managed.

07:27

And to go back to Kari's theme of quotes: don't fall into the classic strategic blunders. The first being, don't get into a land war in Asia, followed by, don't go in against a Sicilian when death is on the line. And the third, less well-known one is, don't outsource your change management to time. Don't stick it at the end. Don't call it training and documentation, but actually engage in it starting early, and make it about education, persuasion, and listening.

08:04

TAMASIN FORD Yeah, 100%. Why don't we dive into these a little bit, and we can discuss along the way and see what other gems we can bring out? I guess starting on the education side, I myself have had some experiences in the past with training and development, and I really believe that the key ideas you can use to make training effective are known to us all; we just kind of forget them in the heat of the moment of trying to do all this other technology stuff, to manage the data, to get all the calculations right, and so on. The modern psychology and education research is really clear. You can find it with a 30-second search on YouTube. I'm just going to pick one example, a concept that, as soon as I say it, you're going to recognize. It's called spaced repetition. The idea is that just as someone is starting to forget something, you want to remind them again. It leads to better long-term retention of the knowledge. And we all intuitively know that people don't effectively absorb large amounts of information all in a single go.
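As an illustration of the concept (a minimal sketch, not something from the webinar): the core of spaced repetition is a schedule whose gaps grow with each successful review. The interval-doubling rule below is an assumed simplification; real systems tune the intervals per learner and per topic.

```python
from datetime import date, timedelta

def next_review(last_review: date, times_reviewed: int) -> date:
    # Each successful review roughly doubles the gap before the next one
    # (1, 2, 4, 8 days), so the reminder lands just as recall starts to fade.
    return last_review + timedelta(days=2 ** times_reviewed)

# Example: a topic first covered today gets revisited on a widening schedule.
review = date.today()
for n in range(4):
    review = next_review(review, n)
    print(f"Review {n + 1}: {review}")
```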

09:24

And so that typical end-of-project training schedule may feel really efficient to everyone. But in terms of supporting how people really learn new information or skills, it's very much misaligned. And you've probably felt this yourself if you've tried to learn anything new recently. Whether it's a lecture, a video, or an article, the main points at the beginning land strongly, and then it's like your brain gradually gets all filled up. By the end, it's hard to even concentrate. And a week later, forget it, you've forgotten most of it. If there's no follow-up, it's likely you literally haven't learned a single thing.

10:00

So when I'm doing an implementation, I actually start to incorporate many trainings right inside of regular weekly working sessions, pretty much as soon as there's anything meaningful to do. So I'm going to be showing my customers, my users, how to go into the system and do something that is specific and targeted. An example in Molecule could be showing them how to filter trades and find the ones they want, or how to add a new column to the view to get the information onto the screen that they need. At the same time, everybody's learning what the screens look like, how to navigate. And that's all easy for me, but it's foreign to them, and so just seeing it over and over again starts to reduce that unfamiliarity. I don't know about you, Alex, but I always follow the assumption that there's someone new on the call or someone who has forgotten how it works, and so I'm recapping on a constant basis.

10:58

ALEX CHANDY Yeah. And the thing about education is it's a two-way street, right? So hands-on training compounds on itself, and people will actually start to reveal how they do their work. And that's education back to the project team. Because one of the things people don't realize is that sometimes there's a credibility gap between the project team and the end user. If you go back to "this is the way we've always done it," it's actually a symptom of "you don't really understand what I do." And I do different things depending on the context: the time of the month, what time frame I'm working on. Sometimes I may be doing stuff in the current month, actualizing physicals, or I'm doing something in the forward months, putting in forecasts or projections. This hands-on education is a great way to start breaking down that credibility gap with the user who says, "you don't really understand what I'm doing," because the project team starts really understanding what the user does in the context of time and the context of the task.

12:03

TAMASIN FORD Exactly. It gets it from the theoretical into the practical, right? I really love the concept of the two-way street. To me, the way I would look at it is that you're interweaving education and requirements gathering. A typical requirements gathering conversation might go something like, "Oh, we need to lock our trades so that the traders can't go back in and change them without permission." I need to translate this into how it would be set up in Molecule, specifically for my implementation. So I could ask a bunch of questions, figure out how they want it to work, and then go away and do my thing behind the curtain. Or I could open up the system, put it up on the screen in a demo environment, show them the exact screen where the trade locking is configured, open up a trade, show them exactly what it looks like when it's locked, and point out some of the settings and permissions and the impact they have. By taking that requirements gathering and turning it into an ad hoc "here's how the system looks and here's how it works," we're getting those little tidbits or snacks of training along the way, and we're talking to each other collaboratively and interactively instead of one-way communication in each direction.

13:31

This is a classic resistance challenge that we're trying to overcome in an ETRM: "You can pry my spreadsheets out of my cold, dead hands." And, yeah, I mean —

13:54

ALEX CHANDY Spreadsheets are the ultimate software application, the nemesis of any packaged software, because spreadsheets give the user so much control over the data, over the reporting, over the formatting. But they also come with their cons: it's hard to have audit trails. You have to save off a spreadsheet every day. God help you if you're trading something in real time; you'd have to save off a spreadsheet every hour just to know what was happening. But the reality is they offer tremendous flexibility, and they're a formidable, formidable application to go up against, to get people to change from.

14:35

TAMASIN FORD Yeah, absolutely. I guess that's where we come to persuasion. What are some of the ways that we can make the case for change and not just educate them, but actually start to change hearts and minds, as they say?

14:51

ALEX CHANDY Yeah. Persuasion is, again, a very, very powerful psychological concept. If you think about things such as hypnosis, hypnosis is really about implanting persuasion into someone's mind so they think they're acting of their own accord when they've actually been told to act that way. But this is a different context. In this case, understanding what change you're asking for, and then tailoring your change strategy around that, is incredibly important. So persuasion is the tactic that you use. And when I say persuasion, you may have small changes: I'm doing the same process, but now it's in a new GUI, right? There, your persuasion strategy is around documentation, training, and support. Then there are the larger things: I'm putting in a piece of automation on something that was done manually before, or, in the case of spreadsheets, I'm going from a complete spreadsheet environment to an automated system. These are huge changes. And here the persuasion strategy is one around building trust. You build trust by proving concepts, doing things iteratively, starting smaller, getting the user to trust the system, not black-boxing automation, allowing them to look inside so they understand what's going on and can feel comfortable even troubleshooting it.

16:17

And that concept is really around focusing on proving the concept to get the adoption, as opposed to chasing an end outcome, like how quickly we can get this to scale by a given date, so that the customer or the user becomes completely comfortable. And then the benefits multiply, because once they believe and they understand, you now have the testimonial, and you may have the business outcomes as well, that allow you to scale. We currently have two radically different customers, one a very large organization and one a small organization, that are both transitioning from spreadsheets in very, very complex trading environments. And both of them naturally came to the conclusion of, "Hey, let's start with one or two projects. Let's get everyone used to using the application, gather their feedback, and make the configuration changes or the slight enhancements so that we can roll this out in a much faster and a much more acceptable way." So it's like, slow down to go fast. It's never one size fits all. Understand the change.

17:34

TAMASIN FORD The classic psychological concepts of persuasion. I mean, persuasion isn't fast talking. It's real. It's doing. It's showing the social proof; it's allowing people to test it and prove it to themselves. And as you say, it's like the thin edge of a wedge: you get a certain distance, and that becomes the next turn of the flywheel, or the fuel for the next go-around.

18:07

ALEX CHANDY The third aspect, again, comes in when people say, I have deadlines, right? I have deadlines, and I need to get this done by a deadline. That brings us to listening. Are you really listening to what the user needs to get done?

18:28

TAMASIN FORD I think it's really tempting to think that if you're educating and you're persuading, then you've done your homework, you understand it all, and this is going to be a slam dunk. Of course, I'm giving them all the right arguments. I'm showing them the process. I'm doing it all by the book. But again, these are people, not machine cogs. They're going to respond the way that they respond. And the long-term adoption is ultimately going to succeed or fail based on their actions. Not ours as the vendor, for sure, and maybe not even those of the implementation team within the customer. I feel like the number one thing on listening is very much like my argument on the training: don't wait till the end. Don't do all of the requirements gathering and setup and then present it as a fait accompli and say, "OK, so now you give feedback," when you really mean, "Well, if you give feedback, we're going to whine that it's too late and that it will delay our go-live date." Really, there's no chance for feedback at that point. The listening tour, as they sometimes call it, starts on day one.

19:48

I would say one of the biggest things from the vendor side is making sure that my counterparts have brought all the right people in, so that we're hearing that feedback from everybody: the accounting clerk if we're going into the back office, all the way through the middle office and what the risk managers think, and then the front office and what the traders are saying, because they all have such different needs and perspectives.

20:16

ALEX CHANDY Yeah. And there are also two aspects of listening. There's reactive listening: you give people the opportunity, they tell you things, and you react in that feedback loop. And then there's also proactive listening. So for example, as the vendor, in our case Molecule, we can share how the customers are using the system. What screens do they go to? Where do they get stuck? How long does it take them to get a particular task done? Sharing and being transparent around where the bugs are, where the issues are that they raise, and how quickly we picked up on them and solved them builds credibility with the user as well. It's one thing to listen and then never act on it; it just goes into a black hole. The other thing is to always be transparent about what you heard and what actions you took.

21:08

TAMASIN FORD Yeah, 100%. I think that sometimes when we make project plans, we tend to have all of the big, obvious stuff on them. And then they may get tweaked and updated around the margins, but we treat the issues and questions that come up as separate. In some ways, you need that issue list to be equally as important as your project plan, because over time, if you're not addressing everything on that issue list, you're going to hear about it sooner or later. And if it's later, it might really disrupt go-live plans and so on. So I think this transparency around gathering user feedback, and making sure, as you say, Alex, that every single thing is addressed, is key. And addressed doesn't mean that you just do what they say. Addressed means that you listen, that you ask questions and follow up, that you interpret what is needed, that you show them the way the system is designed to do it, and that you find out whether that's going to work, maybe with a little process change to use the system as intended, or whether it requires a workaround, or maybe even a feature or enhancement request.

22:24

ALEX CHANDY Yeah, and sometimes just the opportunity to talk it through gets the user to realize this is something that probably belongs in another system rather than making this system bend to do something unnaturally. Can you build me an integration or a report that would help me out? Sometimes that is the ask at the end of the day that you uncover rather than, can it do this?

22:46

TAMASIN FORD Yeah. And with that in mind, one of the things that I try to do on my implementations is to communicate, especially to the leadership of the customer's project team, here is the set of tools, the toolkit, that I have available. So as you mentioned, sometimes a report can be the workaround, because sometimes there will be a question like, "Well, can you stop a user from doing X?" And oftentimes my answer will be, "Well, rather than stopping them from doing it, because the system is probably just recording what they've already done in the real world anyway, maybe what we should be doing is surfacing it through an audit report so that you can address the fact that it's happening in the first place." That's a tool that's at our disposal, and over time, they'll start to think that way, too. I have one more of these resistance slides.
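As a hedged sketch of that audit-report idea (the export file and column names here are hypothetical, purely for illustration): given an audit log of trade edits that records lock timestamps, surfacing the edits made after locking is a simple filter.

```python
import pandas as pd

# Hypothetical audit-log export; adjust column names to your environment.
audit = pd.read_csv("audit_log.csv", parse_dates=["edited_at", "locked_at"])

# Rather than blocking the edits, surface the ones made after the trade
# was locked so the underlying behavior can be addressed.
late_edits = audit.loc[
    audit["edited_at"] > audit["locked_at"],
    ["trade_id", "edited_by", "edited_at", "field_changed"],
]
print(late_edits.to_string(index=False))
```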

23:54

ALEX CHANDY It's, again, back to the trust issue, isn't it? It's like, I don't trust it unless I've done it myself. Which goes to a point we touched on a bit earlier: try not to black box stuff, especially today, when there's such a push on digitization and analytics and, God help us, AI. There's going to be more of the "I don't trust what your box did." Educate them, enable them to understand how the system is doing something, how it thinks about it, so that they themselves can troubleshoot it. I like this analogy, and it's not an ETRM analogy: I think about pilots in a modern airplane. A modern airplane is so highly computerized and so highly networked that it can literally take off, fly, and land all by itself with minimal guidance. Pilots themselves have to go through incredible training, because what they are trained to do is understand why the plane is doing what it's doing. They understand the mechanisms and the processes that are giving the inputs, and therefore how the plane is reacting to those inputs, so they can troubleshoot. And that's what gives them the confidence to use an autopilot system, an autoland system, and to know when they need to intervene and when they don't.

25:16

Now, it's an extreme example, because they have redundancies in an airplane, but it's the same thing. It's trust through understanding the process behind it, to the point where the person values the time saved because they trust the accuracy or the precision coming out of the system. That's where you want to get them.

25:36

TAMASIN FORD All right. So I think we've gathered up a few tips and lessons learned. A bunch of these have come up through our chat so far. But one of the things that we didn't really talk about is how manual processes can be, I guess, a yellow flag, I will say, maybe not a red flag, but a symptom of a deeper issue. It's worth looking at, well, why do you have a manual process? And maybe, why do you even prefer to keep it manual? What does it mean to you?

26:10

ALEX CHANDY Yeah, absolutely. A lot of times, manual processes exist because there's an issue with the data: either the provenance of the data, or what the most updated data is that I can get. I can get better data than what the source system is giving me, so I'm manually intervening. It may also be a symptom of a nonfunctional requirement: you still haven't figured out what they're doing and why they need to do it. So it requires deeper digging. And that's why going back to understand what the users do in detail, if you have the luxury to do that prior to implementation, helps tremendously. It helps tremendously with uncovering all these NFRs or data issues.

26:52

TAMASIN FORD Absolutely. And the fourth one there is about continuous improvement and feedback loops. I know I mentioned the project plan before and how it has to be updated and supplemented with a live issue log. We know that if you need to fly an airplane, or a spaceship to the moon, then you're going to do waterfall software development: you're going to plan it all in advance, and then you're going to execute it all. But in the real world, it's this ability to cycle and loop and get feedback that matters.

27:22

ALEX CHANDY Yeah, and this is what I alluded to in the beginning: the outcome around capability development is really what you want to get to. This is the prize that you're really looking for. The prize is not deploying an application within a certain budget or time frame, even though we know those are important things. What you're trying to do at the end of the day, and this is what the change process actually institutionalizes, is embedding the knowledge and building or tuning continuous improvement and forecasts, because now you can track things at a much finer granularity. But I think the biggest prize, which is monetizing the first two things, is really allowing your users to identify innovation. It could be small innovation; we're not saying that they're going to change the way you run a refinery or a pipeline. But you give them the freedom to innovate in the way they work and streamline things, because now they're thinking about their function instead of working an application, or, God forbid, a spreadsheet, just to get what needs to be done for the day. You're allowing them the ability to actually innovate on their work.

28:39

And what comes with that is the responsibility for you to capture those things and champion them. That's the goal at the end of the day.

28:47

TAMASIN FORD Well, I think you've done a great job of bringing us back full circle and leaving a bit of inspiration there. So maybe now is a good time to open things up for questions.

28:57

KARI FOSTER Excellent. Well, thank you both so much. What a great discussion. And, as I put into the chat, if you have any questions for our presenters today, go ahead and put those in the chat for us. And let's kick things off with this question: Having been through many implementations, from your experience, what are your top do's and don'ts that everyone should know as they start an implementation? So thinking about what that one thing is that they should do or not do to make the implementation go smoother and more efficiently.

29:39

ALEX CHANDY I would start by saying preparation is probably the best formula for success: having the ability to really understand and map out what your users do. I'll give you an example. Prior to Molecule, I was on a system implementation where they did have the budget to go and hire consultants to ask, OK, what do our target users do, and what do we want this application to enable? And they had beautiful SIPOC diagrams and workflows at levels one through five. But really, they were just taking a snapshot of what someone did in general. What matters is understanding, OK, at the beginning of the month, the middle of the month, the end of the month, these are the things I do. It takes time and it takes effort. But if you can map that, it gives you such a leg up in the implementation process and in the fine-tuning of what you want the system to enable: where is it that I require accuracy, and where is it that I require precision? Because these things tend to get smashed in, and then it becomes a race to the bottom to manage budgets and time frames, and everything gets thrown out.

30:50

So that would be my first do. And it would also be my first don't, which is don't go into it blindly.

31:00

TAMASIN FORD Yeah, I think from my side, my main do would be to get the right people identified and involved, and to have a plan for how they get brought in at various stages of the project. You need to strike a balance. You don't want your back office team sitting through all the nitty-gritty of how traders are going to enter trades into the trade screen. But at the same time, you don't want to leave them to the end and find out that there was something they needed the traders to be capturing at that moment; otherwise, the accounting isn't coded correctly. So knowing who the people are who know all these things, and how they get pulled in at the right moments, is really critical. I did have one customer where, rather than having a single person running the project, they made it a team of three. They had a senior person from each of front office, middle office, and IT, and the three of them worked really well together. The front office person took a bit more of the lead, but it was really the three of them together that acted almost as the executive committee on the project team.

32:23

Having the different perspectives planned for and incorporated early on meant that we weren't scrambling later on to say, Oh, well, what about the back office perspective?

32:39

KARI FOSTER Yeah, having that team of representatives, each representing the needs of their particular part of the implementation. That's really important, what you bring up, Tamasin, so you don't have the wrong people involved at the wrong time. This one just came in from Johan: What strategies do you use for trade migration and reconciliation on an implementation?

33:11

TAMASIN FORD I guess the first thing is that it depends on the scope. I would probably immediately say, try to break that trade population apart into the relevant pieces and migrate and reconcile each piece, because I think we've all been through it: the larger the data set, the harder things can be. Especially because, if there's an error, say a field doesn't come across and needs to be reloaded, large populations are simply more time-consuming than small populations for all of those things. So that would be number one. Number two is that, as far as the migration goes, I come from a very Molecule-centric perspective, and in Molecule, the trade loading is preceded by product setup. You have to get the product right in order for the trade to come in and use it. So a big part of it is spending the time on product analysis so that the trades come in with all the attributes they need. And then on the reconciliation side, generally speaking, I think it's probably a lot of the standard strategies that you would imagine.

34:43

You're going to reconcile based on your quantities. You may slice and dice by your tenors, by your products, essentially looking at it and saying, do all my Henry Hub trades in April 2024 add up to the same number as the spreadsheet? But there's a trick that I also like: if you take all the trades from your original load, which is often in the form of a spreadsheet, and then you take all the trades extracted from Molecule, change the sign on one of the two sets, and add them together, you can reconcile by looking for groups that don't sum to zero. That can be a neat trick for finding the places where things aren't matching.
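To make that sign-flip trick concrete, here is a minimal sketch in Python with pandas. The file and column names are assumptions for illustration; the idea is exactly as described: negate one side, stack the two trade sets, and group-sum to find anything that doesn't net to zero.

```python
import pandas as pd

# Hypothetical extracts: the original load (often a spreadsheet) and the
# trades extracted back out of the new system.
source = pd.read_csv("legacy_spreadsheet.csv")  # product, tenor, quantity, ...
target = pd.read_csv("system_extract.csv")      # same columns

# Flip the sign on one side, then stack both sets together.
target["quantity"] *= -1
combined = pd.concat([source, target], ignore_index=True)

# Sum by the keys you care about; any group that doesn't net to (near)
# zero is a break worth investigating, e.g. Henry Hub / Apr-2024.
breaks = (
    combined.groupby(["product", "tenor"])["quantity"].sum()
    .loc[lambda s: s.abs() > 1e-9]
)
print(breaks)
```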

35:40

ALEX CHANDY The other thing, Kari, is that sometimes there's a decision, and it's not a light decision, to be made at the beginning, and it depends on where your starting point is. If you are ripping out and replacing another ETRM, sometimes the question is, because of valuation approaches and because you've agreed on a new standard, at what point do you start forward? Do you go back years and replicate trades that are no longer active, or do you start with only your active trades? If you're coming from a spreadsheet regime, where you were capturing all your obligations in spreadsheets and doing all your mark-to-market in a spreadsheet, you have a little less flexibility. You may need to go back in time to build a full history in what is now the system of record, because most of the time, what initiates a "let's get off spreadsheets" is that you will not meet an audit requirement otherwise. So you now have the burden of going back in history and capturing a full year's worth of transactions. Those are decisions that feed into the starting point that Tamasin was talking about.

36:55

KARI FOSTER Great. Thank you. We have time for one more question. Adriana asks, how do you suggest training be done for a large group? She goes on: per activity similarity, or per phase of the trade activity? So you're starting the training just for the traders, then the shipping team, then the execution and operations team. How do you train the teams?

37:22

TAMASIN FORD Definitely by activity similarity, as she called it, with the front office or the back office being trained together, sometimes even the subgroups of accounting being trained together. But as I mentioned during the presentation, I also really do encourage that the early stages of training happen almost organically along the way, so that you're not getting to the end and people are logging into Molecule, or whatever system you're implementing, for the first time. Instead, you've got a bit of that familiarity, and all you're really doing is nailing down the specifics. I think that, in general, most people have a small number of tasks that they need to do. They don't need a large training or a long training. What they need is to understand very clearly: I enter a trade in this way; sometimes I actualize my daily volume, and I have to do it on this screen in this way; sometimes the trade has been locked, and I need to get it unlocked so I can correct it, and this is the process I'm going to follow. You get this short list of, I guess you could call them, user stories, and you build the training around a group of users that share the same user stories.

38:56

Generally speaking, if you make the training too formal, you're not going to have it very often, because it takes time out of everybody's day and so on. But that's why, for that repetition aspect, I really do like it when, as you're gathering requirements, as you're demonstrating, as you're doing your reconciliation and so on, you're also sprinkling the training throughout.

39:24

ALEX CHANDY Yeah, I would just add to that that you have sort of two kinds of tests you're doing. The first is, obviously, you want to be able to test end-to-end, from deal capture all the way to actualization. And that is a function of exactly what Tamasin is talking about: you get trades into the system, deals into the system, especially ones that folks can recognize, so they feel more commonality. You have the ability to simulate the different updates. You can go, OK, for the traders, they want to make sure that for the trades that are captured, the position reflects what they expect it to be, and that the mark-to-market and the P&L are correct from a valuation perspective. As you flow the trade through its lifecycle events, you can bring folks in to train them specifically on their tasks. Risk managers need to look at positions, to look at total mark-to-market exposures. When you get to operations, you simulate end-of-day or end-of-month actualization into physicals, then tickets that turn into inventory, and then the back office taking that inventory through a sale, through an invoice extract. That's the end-to-end lifecycle. The ability to simulate that, and not have to wait until the middle of the month or the end of the month to get those behaviors, is massively helpful in training the different groups.

41:01

And then, to Tamasin's other point, there are workflow issues, right? The middle office needs to be able to lock and unlock trades. There are permissions that need to be granted. How do I enter a trade? How do I set up a user, or remove a user? Those can be bundled up as workflow training. And that's how you manage the teams. So it's not, we have everything in, and then let's run a test over one month so we can do everything as the clock ticks through the appropriate times of the month. It's doing it through the configuration and setup process, so that as you get closer and closer to go-live or parallel runs, you have teams that know exactly what the system can and will do for them, what reports they can generate, and what the handoff looks like to their upstream and downstream colleagues.

41:56

KARI FOSTER Awesome. Well, thank you both. This was very helpful. And so this brings us to the end of our webinar. So first of all, thank you, Tamasin. Thank you, Alex, for joining today, for imparting some really great wisdom and helpful practical advice. Thank you all again, and have a great rest of your day.

42:20

ALEX CHANDY Thank you.

42:21

TAMASIN FORD Thank you.
