Radical Influence Podcast Ep. 1 Horizon Monitoring and the Need for AI Technology Solutions

In this hour-long podcast episode, Rick Ferraro, host and founder of the Radical Influence Network, discusses horizon monitoring and the need for AI technology solutions with Pete Fitzsimmons, CEO & Co-Founder of Contexture AI.

One of the six pillars of Radical Influence is “Situational Awareness” powered by AI. The Network has partnered with Contexture to provide the essential tools to utilize the power of artificial intelligence to surface crucial information to help us understand the issue landscape. Pete explains the technology powered by Contexture and helps us understand how we can leverage it for maximum advantage.

00:00:00.040 –> 00:00:01.360
move it from their editor.

00:00:01.940 –> 00:00:03.050
Okay we are recording.

00:00:04.940 –> 00:00:20.160
Welcome to the Radical Influence Network podcast, or RIN. This is a free, collaborative group of professionals who want to improve their performance in how they interact with stakeholders, advocate for, and act as stewards of the organization’s reputation.

00:00:21.140 –> 00:00:28.660
Today our topic is horizon monitoring and the need for AI technology solutions.

00:00:29.140 –> 00:00:39.750
Our guest today is Peter Fitzsimmons, the co-founder and CEO of Contexture AI, an artificial intelligence technology company headquartered in Reston, Virginia.

00:00:40.540 –> 00:00:50.760
They use natural language processing, machine learning, and artificial intelligence to power their technology services; two popular ones are Bill Watch and Media Watch.

00:00:51.440 –> 00:01:07.260
We’re going to hear from Pete on the need for AI in gaining situational awareness for organizations across the information landscape. He’ll explain how the technology works and why it is essential for keeping up with the flood of information we face in understanding the issue landscape.

00:01:08.040 –> 00:01:21.760
This information is directly tied into one of the pillars of Radical Influence, situational awareness powered by AI, which is considered an essential element of achieving radical influence.

00:01:21.880 –> 00:01:36.850
In full disclosure, I have known Pete for several years, I subscribe to Bill Watch, which is helpful in my work, and Contexture powers the media feed, using their product Media Watch, that is available on the Radical Influence Network website for members.

00:01:36.870 –> 00:01:43.760
So Pete, let’s begin. Could you tell us a bit about your background and what led you to forming Contexture?

00:01:44.740 –> 00:01:48.390
Good morning Rick and thanks so much for having me on the program today.

00:01:48.390 –> 00:02:05.930
I’m thrilled to be here and to talk a little bit about artificial intelligence and the state of the technology as it relates to this pillar of RIN’s, one of the six pillars, in and around artificial intelligence and cognitive science.

00:02:05.940 –> 00:02:31.050
So Contexture was formed approximately three years ago as a partnership between me and my CTO, Craig Lovell, who I’ve been working with for a little over a decade. Craig and I met and worked together at a SaaS-enabled technology business here in the northern Virginia area called Primatics Financial.

00:02:31.640 –> 00:02:41.460
And the business was focused on providing risk management and compliance solutions into the regulated banking market.

00:02:42.240 –> 00:02:46.160
So that was really the start of my technology journey here.

00:02:46.170 –> 00:03:00.500
Craig is a computer scientist with a fabulous graduate degree from Carnegie Mellon University up in Pittsburgh in this area of artificial intelligence as it relates to natural language processing.

00:03:00.500 –> 00:03:04.810
Machine learning has been a focus for him for well over two decades.

00:03:04.820 –> 00:03:26.160
So after Craig and I had the opportunity to exit from Primatics and were looking around for other things to do, the opportunity to take artificial intelligence and apply technology to quantitative analysis of unstructured content was becoming more and more mainstream.

00:03:26.840 –> 00:03:34.190
There was a history of providing these types of solutions in other industry verticals.

00:03:34.190 –> 00:03:42.690
The legal services industry uses this type of technology to perform discovery on large amounts of content.

00:03:42.690 –> 00:04:08.430
The Enron case is a good example of using this type of technology to identify interesting patterns in unstructured data or unstructured content. So Craig and I formed Contexture with the specific focus and mission of bringing this type of technology into the unstructured content market across multiple industry verticals: financial services, energy, manufacturing, and the like.

00:04:08.440 –> 00:04:28.860
Where we can help organizations and enterprises to better understand what is transpiring in their marketplace, and understand the landscape of information that is just exploding, Rick, by exponential factors in the amount of content and its complexity.

00:04:29.240 –> 00:04:35.660
So the need here for technology to be a real practical solution is very palpable for many organizations.

00:04:36.140 –> 00:04:57.560
So I’m really excited about talking through with you what the technology sort of represents, how the technology can be applied and discussing some real world anecdotes with some Fortune 100 companies that we’ve been working with and providing them with some strategy and insights here around how to apply artificial intelligence to some of their most pressing business challenges.

00:04:58.340 –> 00:05:07.560
So Pete, probably everyone listening to this podcast has some kind of news feed from their organization or that they subscribe to.

00:05:08.720 –> 00:05:15.460
In what way is what you’re offering different? And, you know, maybe this will emerge in the conversation.

00:05:15.840 –> 00:05:19.650
But how is that different than what everyone is using today?

00:05:20.040 –> 00:05:33.810
That’s a great question, Rick, and it really represents the differentiation between old-world technologies, old-world approaches to identifying content and aggregating it.

00:05:33.810 –> 00:05:42.080
This is how these current technologies are really differentiated, and there’s such tremendous promise and opportunity around them.

00:05:42.090 –> 00:05:52.030
So if you’re a traditional user of media services, maybe you have a profile set up where you’re federating information and media content from multiple sites.

00:05:52.040 –> 00:05:58.930
Typically, with those types of services you put in some keywords, very similar to how you would do a Google search.

00:05:58.940 –> 00:06:27.290
You put in a couple of keywords and you aspire to have some level of response from either the federation service or from Google, to give you meaningful results of content that comes back. And these old-world technologies, as we like to call them, were born out of necessity: you have to have some way of querying the data, and so everybody has become extremely astute at using Google and putting in keywords.

00:06:27.300 –> 00:06:37.360
The challenge with that approach, however, is that there’s no context surrounding the query terms that you’re providing to the system.

00:06:38.040 –> 00:06:42.880
“Bark” is a great example.

00:06:42.890 –> 00:06:46.640
Do you mean bark of a dog or the bark of a tree?

00:06:46.650 –> 00:06:50.450
Or perhaps you’re referring to an old world sailing ship?

00:06:51.210 –> 00:06:59.690
That was the purview of Sir Francis Drake, and the Golden Hind was a barque, B-A-R-Q-U-E.

00:06:59.700 –> 00:07:03.930
And so context today has become increasingly important.

00:07:03.940 –> 00:07:21.220
However, these technologies, Google and other old-world search technologies, aren’t able to provide meaningful results back to users when they’re doing their work and applying their queries to the system.

00:07:21.220 –> 00:07:36.910
So artificial intelligence, as it relates to natural language processing, and our machine learning algorithms are intended to provide context to the inquisition, the interrogation, and the curation of content.

00:07:36.920 –> 00:07:47.750
And we’re able to use these technologies to apply math to express context in the query, in the search that’s being performed by the user.

00:07:47.750 –> 00:08:10.090
And really, the full breadth of the platform is intended to completely reverse the whole dynamic: instead of the user trying to pull data out of the repository, the technology is really surfing, if you think of content as a huge lake of information.

00:08:10.110 –> 00:08:18.600
These technologies surf that lake, and we’re pushing results back to the user based upon their initial expression of what they’re interested in.

00:08:18.600 –> 00:08:22.560
But then also the machine learning component represents this ongoing interaction.

00:08:23.140 –> 00:08:31.450
Yeah, I use Google Alerts, and probably far less than a third are actually useful.

00:08:32.040 –> 00:08:35.680
The finds that I get because of that keyword.

00:08:36.320 –> 00:08:42.760
You know, if I’m busy, this is gonna be a problem.

00:08:44.140 –> 00:08:51.500
So is this gonna save me time, you know, through this AI technique?

00:08:51.680 –> 00:08:53.930
Is that part of the objective?

00:08:53.940 –> 00:08:59.930
That’s certainly part of the objective, to improve the efficiency of the user.

00:09:00.590 –> 00:09:12.890
Number one, so that you’re spending less time going through irrelevant results, and essentially having to curate what gets returned to you, to getting to the substance of what you’re interested in.

00:09:12.890 –> 00:09:16.650
So there’s an efficiency aspect in that particular regard.

00:09:17.140 –> 00:09:32.550
Then there’s also the flip side of that, which is effectiveness: there are results that contextually could be very relevant for a user, but they may not be expressed using the keywords that the user has input to the system.

00:09:32.940 –> 00:09:40.490
So this is where technology is a completely objective arbitrator of these types of determinations.

00:09:40.490 –> 00:09:46.180
And so technology is going to look at context and it doesn’t care what language it’s expressed in.

00:09:46.180 –> 00:09:53.830
It doesn’t need to understand what the specific words are that can be used to talk about a particular topic.

00:09:53.840 –> 00:09:57.160
What’s really important here is context and so there’s efficiency.

00:09:57.540 –> 00:10:05.540
Then there’s also effectiveness where we’re able to improve the precision of of the results that are being returned.

00:10:05.550 –> 00:10:10.850
The key thing there, Rick, as your users sort of think about this business challenge:

00:10:11.240 –> 00:10:12.670
It’s twofold, right?

00:10:12.680 –> 00:10:21.960
One aspect is the level of precision: how relevant are the results that are being returned? And then there’s also this aspect of recall.

00:10:22.440 –> 00:10:29.060
Am I capturing substantively all of the universe of content that would possibly be relevant to me?

00:10:29.540 –> 00:10:45.460
So there’s this wonderful, healthy conflict between precision and recall, where those algorithms are constantly fighting back and forth around what content really needs to be surfaced, so that we can focus in on both efficiency and effectiveness.
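[Editor’s note] The precision/recall trade-off Pete describes can be made concrete with a small example. The article IDs and relevance labels below are invented for illustration, not taken from any Contexture system:

```python
# Hypothetical monitoring run: which article IDs the engine surfaced,
# and which articles a human reviewer judged truly relevant.
returned = {"a1", "a2", "a3", "a4", "a5"}        # surfaced by the engine
relevant = {"a2", "a4", "a5", "a6", "a7", "a8"}  # actually relevant

true_positives = returned & relevant             # surfaced AND relevant

# Precision: of what came back, how much was relevant? (efficiency)
precision = len(true_positives) / len(returned)  # 3/5

# Recall: of everything relevant, how much did we capture? (effectiveness)
recall = len(true_positives) / len(relevant)     # 3/6

print(f"precision={precision:.2f}, recall={recall:.2f}")  # precision=0.60, recall=0.50
```

Tuning the engine to return fewer results usually raises precision at the cost of recall, and vice versa, which is the “healthy conflict” being described.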

00:10:45.460 –> 00:11:04.550
And it’s really where we try and focus our time, sharpening our algorithms, sharpening our technology, to be able to look at that fat in the middle of the bell curve where there is some nuance and some decisioning that really needs to be made, and this is where the machine learning component kicks in as well.

00:11:04.550 –> 00:11:08.770
This is where human intervention is a necessary part of the solution.

00:11:08.790 –> 00:11:12.720
Technology is not a silver bullet, but it certainly can improve a lot

00:11:12.730 –> 00:11:14.350
of what the current landscape looks like.

00:11:15.340 –> 00:11:24.060
So Pete, you mentioned earlier that you may have some real-world examples. Is there anything you can give us that kind of brings this to life?

00:11:25.040 –> 00:11:32.030
Yeah, there’s a number of different use cases, anecdotes, and business challenges across the spectrum.

00:11:32.030 –> 00:11:58.550
Within the enterprise organization, we’ve had conversations and have advised clients, Fortune 500 businesses, that are running into inordinate business challenges around how to manage the exponential volume of content and information that’s being produced just by their organization, let alone when you take into account the full breadth of what that might represent for a global company.

00:00:00.000 –> 00:12:40.160
The use cases can be as simple as helping one of the largest government services integrators in the country, if not the world: helping their legal services team and their operations team just to inventory all of the contracts that they have within their organization across the globe, to understand what obligations we have with which counterparties and how we are compliant with the regulations and the expectations of regulators in multiple jurisdictions.

00:12:40.740 –> 00:13:14.760
This integrator was challenged with just trying to understand how they could quantify, for their primary regulator, how many transactions and contracts they had in place with a particular counterparty. Another organization, a global energy business, our good friend there, our client John, is struggling with the topic that I think we’re going to dive into here today, around gaining situational awareness of all of the changes that were occurring for them across the global regulatory landscape.

00:13:15.140 –> 00:13:21.850
And this topic of horizon monitoring is something that was strategically important for them.

00:13:21.850 –> 00:13:32.290
They were being impacted substantively from a financial and regulatory standpoint by changes that were occurring across the global regulatory landscape.

00:13:32.290 –> 00:14:17.850
And they felt very reactive, and the expectation from senior management, from the executive, from the board, was to lean into these regulatory issues as part of their business. Which, as I mentioned, is in the energy market: tremendous scope there of global regulation at various degrees of maturity, around not just production operations, but also in those localities where they were decommissioning large plants, and there is a tremendous amount of community impact and essential stewardship that this company was obligated to.

00:14:17.860 –> 00:14:32.690
And so it’s a great illustration of how the technology is a very necessary component of what an overall solution looks like, in being able to really be effective and truly perform horizon monitoring.

00:14:32.700 –> 00:15:03.400
And so this is a 360-degree periscope to look externally to the company at regulatory, legislative, and policy changes, public opinion, decommissioning requirements, and those kinds of things, all determined and found from one solution. Is that what I’m hearing you describe?

00:15:03.410 –> 00:15:10.980
That’s their aspiration, and that’s where we’ve been advising them on how to bring that solution together.

00:15:11.410 –> 00:15:18.620
They’re challenged by the fact that there isn’t an off-the-shelf technology solution that represents that today.

00:15:18.620 –> 00:15:27.830
There are companies that are trying to address various aspects, for example, stakeholder engagement and being sensitive to what’s happening at the local community level.

00:15:27.830 –> 00:15:58.070
There are some organizations that are looking to put together workflow-type platforms that can help companies organize themselves around those agendas. But it’s the synthesis of what you just described: the huge cross-section of data sources, some structured, mostly unstructured, that can have just enormous impacts from a reputation and branding standpoint, from the costs associated with regulatory compliance, etcetera.

00:15:58.140 –> 00:16:20.320
They’re endeavoring to be proactive in establishing a very comprehensive solution here, one that takes the array of different information sources and synthesizes them through a risk-scoring platform that can enable them to be successful in understanding: where is the risk, and how does that risk manifest itself?

00:16:20.320 –> 00:16:24.490
How do we strategically and tactically deploy resources to be proactive and manage it?

00:16:24.500 –> 00:17:00.310
So what I’m hearing is that situational awareness is not just being aware that there’s an article about something related to a local plant, but that it is related to an assessment of risk for the organization. Having visibility into that risk is the purpose of situational awareness, not simply finding the needles in the haystack, the articles that are relevant. It’s a two-step process.

00:17:00.310 –> 00:17:20.050
The first is find the right articles, find the right information, find the right proposed regulations, the new bills, whatever it might be. But then also judge them. And how much of that is technology-driven versus human judgment?

00:17:20.840 –> 00:17:22.030
Great, great question.

00:17:22.030 –> 00:17:35.560
I think you’ve hit the nail on the head relative to a lot of the monitoring solutions that are out there today, for example media monitoring solutions from some of the larger players Meltwater, Cision etcetera.

00:17:35.940 –> 00:17:52.160
They facilitate surfacing potentially interesting pieces of information, but they don’t go that next step, which I think relates to one of the pillars here for RIN around issues management. To be effective in issues management,

00:17:52.540 –> 00:18:04.860
one has to not just measure; you also need to be able to manage back against what you’re identifying. And so scoring, understanding from a risk standpoint or an opportunity standpoint,

00:18:05.940 –> 00:18:19.060
where resources should strategically be dedicated, tactically and strategically, becomes a very important component of a proactive and effective horizon monitoring solution.

00:18:20.440 –> 00:18:35.320
So the seed of a new public affairs or government affairs program is actually discovered in the situational awareness component: oh, this is now a big risk.

00:18:35.330 –> 00:18:40.560
We should have a program to address this either internally or externally or both.

00:18:41.040 –> 00:18:53.220
But we should be, quote, doing something about this risk, this issue that we have now surfaced and can quantify in terms of how it’s a lot bigger than these other risks.

00:18:53.220 –> 00:19:00.330
It may not be a perfect measurement, but you can do relative measurement and run from there.

00:19:00.340 –> 00:19:18.140
So situational awareness is not only using a periscope; it’s also seeding future programs, ultimately leading to future program activity of the externally oriented influence function that RIN focuses on.

00:19:18.150 –> 00:19:18.830
Is that right?

00:19:19.810 –> 00:19:21.050
Yeah, no question.

00:19:21.050 –> 00:19:39.370
And so it’s really the weaponization of the periscope: understanding distance to target, understanding, you know, what the settings should be in tuning the torpedo so that when it arrives at its target, it’s going to have the designed impact.

00:19:39.370 –> 00:19:46.700
So situational awareness is fundamentally much more than just purely surfacing information.

00:19:47.120 –> 00:19:48.600
There needs to be analysis.

00:19:48.610 –> 00:20:18.580
It needs to be in context for the organization. And, as you said, in retooling and facilitating digital transformation for government affairs and public relations professionals, being able to speak to the rest of the organization in a quantified or quantifiable context is the question that’s being asked by the chief executive and by the chief financial officer. Just surfacing these information points, these risks, is one thing.

00:20:18.580 –> 00:20:26.830
But then there’s providing context for those risks relative to the enterprise and putting an estimate around it.

00:20:26.840 –> 00:20:36.580
Sure, no one’s looking for tremendous precision here in terms of how the estimate may ultimately resolve itself in terms of dollars.

00:20:36.580 –> 00:20:53.910
But you need to be able to talk in the language that the C-suite is really used to interacting in, and the government affairs, public relations, communications, and advocacy groups need to hold themselves accountable here in terms of managing to results.

00:20:53.940 –> 00:21:20.290
So situational awareness, horizon monitoring, necessitates some discipline that really comes out of the risk modeling world. A number of different participants, and a good friend here, John, in particular, speak to data science and data analytics as it relates to horizon monitoring, and it really represents that second element of the work: being able to take the results,

00:21:20.300 –> 00:21:26.970
But now synthesizing and analyzing those results to understand how they are going to impact the organization.

00:21:27.140 –> 00:21:32.960
It’s the only way that the fundamentals of decision making can really be strategically thought through.

00:21:33.640 –> 00:21:45.060
So early on in this conversation, I noticed you talked about multiple technologies, plural, and then sometimes technology, collectively singular.

00:21:45.840 –> 00:21:50.480
I’m imagining that there’s multiple underlying technologies.

00:21:50.490 –> 00:21:54.350
I know natural language processing and machine learning and so forth.

00:21:54.360 –> 00:21:59.160
Could you tell us a little bit about each of these so that it becomes a little bit more real?

00:22:00.140 –> 00:22:01.340
Absolutely happy to.

00:22:01.340 –> 00:22:26.170
And I’ll do my best here to provide some basic fundamentals, as a non-technologist can, and hopefully explain some of this in layman’s terms. The platform, as we like to refer to it, is really an integrated process chain through which we apply mathematics to words.

00:22:26.540 –> 00:22:30.510
So we’re literally ingesting unstructured content.

00:22:30.510 –> 00:22:35.610
We’re taking in media – news media, articles, we’re taking in blog posts.

00:22:35.620 –> 00:22:44.310
We can take in the transcription of podcasts, we can take in regulatory documents and legislative documents.

00:22:45.310 –> 00:22:46.930
and we consume those.

00:22:46.930 –> 00:22:47.670
We ingest them.

00:22:47.670 –> 00:22:55.840
And we literally run a pipeline of different curation and synthesis of those words.

00:22:55.840 –> 00:22:58.550
We literally count words, Rick, is what it comes down to.

00:22:59.240 –> 00:23:17.610
So in counting words, we’re able to then apply some statistics to understand which words in that document are statistically significant, particularly when we compare that phrase, that paragraph, that news article,

00:23:17.620 –> 00:23:24.060
when you compare the statistics of the words that are contained there, to what we call a corpus.

00:23:24.540 –> 00:23:28.410
So essentially we build up a large body of documents.

00:23:28.420 –> 00:23:37.590
A large body of content, and we’re able to ascertain the statistical significance of a particular word.

00:23:37.600 –> 00:23:43.360
It could be glyphosate as an example coming out of the chemical industry.

00:23:43.740 –> 00:23:52.060
And glyphosate, as you would guess, is not in common parlance; people don’t use it generally in conversation. At least I certainly don’t.

00:23:52.440 –> 00:24:12.500
So if glyphosate, in talking about pesticides, is of particular interest to a particular organization, we’re able to use not just the word but also the context that goes around that word, in understanding statistically how significant that word and its context are in a document or a group of documents.

00:24:12.500 –> 00:24:15.700
And we compare those statistics to a much larger body.

00:24:15.710 –> 00:24:25.080
So certainly volume here is important when we’re doing any of this sort of statistical analytics and that’s really what the technology is doing at the heart.

00:24:25.080 –> 00:24:31.660
There’s a synthesis or a breakdown of content into statistically counting words.
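[Editor’s note] The “count words and compare against a corpus” idea Pete sketches is, in its simplest textbook form, TF-IDF weighting: a word scores high when it is frequent in one document but rare across the corpus. This is a generic illustration, not Contexture’s actual pipeline, and the tiny corpus below is invented:

```python
import math
from collections import Counter

# Tiny invented corpus. A rare term like "glyphosate" should score high
# in the one document that uses it; common words score near zero.
corpus = [
    "new rules on pesticide use and glyphosate limits proposed",
    "quarterly earnings and new market rules discussed",
    "community meeting on new plant decommissioning rules",
]
docs = [d.split() for d in corpus]

def tf_idf(term, doc, docs):
    tf = Counter(doc)[term] / len(doc)             # how often in this document
    df = sum(1 for d in docs if term in d)         # how many documents use it
    idf = math.log(len(docs) / df) if df else 0.0  # rarity weight across corpus
    return tf * idf

print(tf_idf("glyphosate", docs[0], docs))  # rare term: noticeably above zero
print(tf_idf("new", docs[0], docs))         # appears in every document: 0.0
```

The corpus comparison is what makes “glyphosate” stand out while everyday words wash out, which is the statistical significance being described.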

00:24:32.040 –> 00:24:43.220
And then once you’ve done that initial analysis, we’re able to apply all sorts of interesting mathematics back against the statistical significance of the results.

00:24:43.230 –> 00:24:54.360
And the most important aspect of what we have in our platform is an algorithm that calculates mathematically a document vector.

00:24:54.740 –> 00:25:05.060
So if we cast our minds back to our old math classes, a vector is a line that is expressed in, essentially, three-dimensional space.

00:25:07.000 –> 00:25:08.990
The math is a little more complicated than that.

00:25:08.990 –> 00:25:16.790
But essentially what we do is we calculate the vector of a document and we’re able to then take that mathematical result for two documents.

00:25:16.790 –> 00:25:22.160
We’re able to compare them against each other mathematically to see what’s the delta.

00:25:22.160 –> 00:25:29.490
What’s the – it’s actually the cosine difference between the two documents, and that gives us an element…

00:25:29.490 –> 00:25:32.260
An element, in essence, of similarity.

00:25:32.940 –> 00:25:39.060
So we’re able to now use math to quantitatively determine how similar two documents are.

00:25:39.060 –> 00:25:42.450
Are they really close in context, or are they much further apart?
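[Editor’s note] The comparison Pete is describing is cosine similarity: each document becomes a vector, and the cosine of the angle between two vectors measures how close they are. A minimal sketch using simple word-count vectors (real systems use much richer, higher-dimensional representations, and the sentences here are invented):

```python
import math
from collections import Counter

def cosine_similarity(doc_a: str, doc_b: str) -> float:
    """Cosine of the angle between two bag-of-words count vectors.
    1.0 means identical direction (very similar); 0.0 means no shared terms."""
    a, b = Counter(doc_a.lower().split()), Counter(doc_b.lower().split())
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b)

# Two pesticide stories should land much closer together than a sports story.
s1 = "state regulators propose new pesticide limits for farms"
s2 = "farms face new pesticide limits under proposed state rules"
s3 = "the home team won the championship game last night"
print(cosine_similarity(s1, s2) > cosine_similarity(s1, s3))  # True
```

The angle, not the keywords themselves, is what lets the engine say two documents are “close in context.”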

00:25:43.040 –> 00:25:46.770
So that gives us a tremendous tool to do all sorts of interesting things.

00:25:46.780 –> 00:25:54.190
So this really is going to help identify articles.

00:25:54.190 –> 00:26:01.120
Let’s just say we’re talking about articles that are relevant and eliminate those that are not.

00:26:01.340 –> 00:26:08.760
Is that the main purpose of this, measuring the angle of difference between the vectors?

00:26:09.140 –> 00:26:10.940
Yeah, that’s correct.

00:26:10.950 –> 00:26:15.240
It does it helps with both recall and precision.

00:26:15.240 –> 00:26:24.670
So it helps in the efficiency of just targeting those articles that are very relevant to the context that we’re querying the repository around.

00:26:25.040 –> 00:26:37.450
But it also helps with effectiveness because now we’re able to identify articles that ordinarily, if you look at the title or if you look at the body of what most of the article was around, you wouldn’t have identified that.

00:26:37.450 –> 00:26:46.340
Oh, down in this second half of the article, it’s talking specifically about pesticides and what’s happening in that marketplace.

00:26:46.340 –> 00:27:13.770
And so our technology is critically important, not only in more efficiently culling out the chaff, what’s irrelevant in documentation and content, but also contextually, in surfacing those documents that ordinarily wouldn’t be identified or captured, because the tools that historically have been used for this type of exercise don’t look at context.

00:27:14.240 –> 00:27:15.740
So I could see some relevance.

00:27:15.740 –> 00:27:27.770
Certain proposed bills have hidden passages, on some seemingly irrelevant topic, that are really hurtful to an industry or a profession or whatever.

00:27:28.140 –> 00:27:36.270
And, you know, if you don’t have a good search mechanism, you wouldn’t come across that. Absolutely a risk.

00:27:36.280 –> 00:28:02.950
Yeah, you’re right. In the legislative market in particular, adversaries and counterparts will go out of their way to avoid using key terms when trying to craft legislation that either positions their advocates to be in a much better posture than the rest of the competitive landscape or, conversely, could be punitive against a particular group that they’re trying to target.

00:28:03.340 –> 00:28:05.910
That’s what lawyers, that’s what lawyers are for, right?

00:28:05.920 –> 00:28:09.850
That’s what lawyers are for; it’s called full employment.

00:28:09.860 –> 00:28:31.170
But at the same time, one of our clients that subscribes to our Bill Watch application had this exact situation arise. They were focused on legislation and regulation being introduced against contractors in the residential security market, and legislation was put forth in the state of Maryland.

00:28:31.540 –> 00:28:38.260
At no point did it mention anything about residential security or home security alarm systems or anything of that nature.

00:28:38.640 –> 00:28:45.160
In fact, they used a phrase: “low energy device monitor” was the phrasing.

00:28:45.740 –> 00:28:50.750
The rest of the language was very consistent with that particular industry vertical.

00:28:50.750 –> 00:29:05.350
But if you were just looking for keywords, you wouldn’t have identified this piece of legislation. It would not have popped up, because none of the typical industry phrases were being used in what was being drafted, thanks to our friends in the legal services market.

00:29:05.740 –> 00:29:12.300
But fortunately with Contexture’s artificial intelligence, we don’t pay attention to specific keywords.

00:29:12.300 –> 00:29:14.010
We look at the overall context.

00:29:14.020 –> 00:29:40.170
We were able to surface those bills as soon as they were introduced as drafts for our client, and through that situational awareness they were able to quickly organize their members to counteract what was being pushed in the Maryland legislature, and to defeat the bill at the finish, because of the attention we were able to bring, the light that we were able to shed on what was being put forth.

00:29:41.040 –> 00:29:44.800
So you mentioned earlier a number of algorithms and so forth.

00:29:44.800 –> 00:29:51.360
I know that this capability, quote, “gets better with time.”

00:00:00.000 –> 00:29:53.960
So it’s machine learning.

00:29:53.960 –> 00:29:57.040
I’m assuming that is really at play here.

00:29:57.050 –> 00:30:01.060
Can you explain how that works to the uninitiated?

00:30:01.440 –> 00:30:03.460
Yeah, that’s another great question.

00:30:03.460 –> 00:30:46.360
Because with artificial intelligence or neural networks, among many who are being introduced to this area for the first time, there’s an expectation that it’s necessary to have large bodies of pre-curated or hand-tagged training sets that can essentially configure the engine to do the work of automating this search for highly relevant content. And at Contexture, we’ve taken a completely different approach to that.

00:30:46.360 –> 00:30:59.450
We don’t need tens of thousands of documents that have been hand-tagged and manually pre-curated by a huge team to create a training set.

00:30:59.460 –> 00:31:12.840
Our AI engines are built off of this consistent interaction, this machine learning exercise, where we are bootstrapping the tuning of the engine on an ongoing basis.

00:31:12.850 –> 00:31:30.070
So the engine gets better with time because of the continuous interaction with the human supervision of users that are interacting with the recommendations of highly important and relevant content that are being produced out of the engine.

00:31:30.080 –> 00:31:37.020
And then the user interacts with that content and provides feedback through a sort of thumbs-up, thumbs-down algorithm.

00:31:37.030 –> 00:31:40.340
The response is yes, this is on target.

00:31:40.350 –> 00:31:54.750
And we also have a workflow that enables the user to create custom tags or topics to classify the content as they’re engaging with it on an ongoing basis.

00:31:54.800 –> 00:32:08.860
That enables the engine to get smarter, and it provides an avenue for the user to do new discovery, where there could be new topics on the periphery of the areas of interest they’ve already expressed.
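
[Editor’s note: Contexture’s actual engine is proprietary. As a rough sketch of the thumbs-up/thumbs-down feedback loop Pete describes, here is a toy relevance model whose term weights are nudged by user feedback; the class, names, and update rule are illustrative, not Contexture’s algorithm.]

```python
from collections import defaultdict

class FeedbackScorer:
    """Toy relevance model: per-term weights nudged by user feedback."""

    def __init__(self, learning_rate=0.1):
        self.weights = defaultdict(float)  # term -> learned weight
        self.lr = learning_rate

    def score(self, doc):
        # Average learned weight of the document's terms.
        terms = doc.lower().split()
        return sum(self.weights[t] for t in terms) / max(len(terms), 1)

    def feedback(self, doc, thumbs_up):
        # Thumbs-up raises the weight of every term in the document,
        # thumbs-down lowers it -- the human-supervision loop.
        delta = self.lr if thumbs_up else -self.lr
        for t in set(doc.lower().split()):
            self.weights[t] += delta

scorer = FeedbackScorer()
scorer.feedback("maryland drafts energy bill on decarbonization", thumbs_up=True)
scorer.feedback("celebrity gossip roundup", thumbs_up=False)
# Documents sharing terms with up-voted content now score higher.
print(scorer.score("new decarbonization bill introduced"))
```

With more feedback of this kind, the ranking sharpens; that is the sense in which such an engine “gets better with time.”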

00:32:09.240 –> 00:32:17.470
We’re able to surface those new discovery points and that enables the user then to say, hey, here’s a topic that’s of interest.

00:32:17.470 –> 00:32:30.270
To me, it’s a little bit nuanced from what I’ve been focused on previously. And there’s a lot of this in the energy market again, around decarbonization efforts and electrification to disintermediate fossil fuels.

00:32:30.640 –> 00:32:34.920
The discovery of what the conversation looks like,

00:32:34.920 –> 00:32:44.320
Rick, is really important. Having this type of technology approach to bootstrapping the engines means that constant feedback loop is there.

00:32:44.330 –> 00:32:47.120
That’s what machine learning is: informing an engine.

00:32:47.370 –> 00:32:51.940
So, I don’t know if this is a fair question or not.

00:32:51.940 –> 00:33:10.850
But how much time is this going to save the typical person in an influence function in a corporation, who’s spending time monitoring bills and regulations and, you know, reading their media feed and so forth?

00:33:10.850 –> 00:33:13.880
Does this cut it in half?

00:33:13.880 –> 00:33:14.560
Does this?

00:33:14.560 –> 00:33:18.760
Or more than that? You know, what kind of benefit?

00:33:18.760 –> 00:33:22.470
And maybe time isn’t the only benefit.

00:33:22.480 –> 00:33:32.270
But how much does this really help the person on the ground who’s, you know, inundated with meetings?

00:33:32.740 –> 00:33:42.270
Who’s always asked to be ready with fresh information, you know, to be called into a briefing for the CEO. How is this going to help them?

00:33:42.740 –> 00:33:44.070
It’s great.

00:33:44.080 –> 00:34:06.790
Another great question, because the value of this type of technology manifests itself in a couple of different ways for the scenarios you just described: government affairs and public relations teams that are currently using old-school technologies to perform certain aspects of situational awareness or horizon monitoring.

00:34:06.790 –> 00:34:09.820
Where they’re accessing content.

00:34:09.820 –> 00:34:24.760
They’re either looking through news articles or they’re looking at legislative and regulatory pronouncements, and they’re using some old-school tools to do that. In our estimation and our experience with our clients,

00:34:25.440 –> 00:34:31.020
the opportunity here is an efficiency savings of probably 75%.

00:34:31.030 –> 00:34:42.260
So if they’re currently dedicating, call it, 10 hours a week to reviewing the huge amounts of content, information, and data coming back to them,

00:34:42.740 –> 00:34:49.580
we think we can save them 7.5 hours a week on that exercise using artificial intelligence.

00:34:49.890 –> 00:35:12.500
And really, that’s the focus here: to automate a lot of the effort that goes into hand reviewing, manually reviewing a lot of irrelevant returns that come back. And that’s once you reach steady state; it takes a little while to go from the current situation to saving 7.5 hours.

00:35:12.510 –> 00:35:17.360
How long is that transition time? Is that weeks, or months?

00:35:18.240 –> 00:35:20.330
Yeah, typically, yeah.

00:35:20.330 –> 00:35:50.160
The timeline to realizing those sorts of results is actually very short, again because our approach to implementing artificial intelligence here is to bootstrap the engines through our machine learning exercises. To get to a steady state of maturity around the context for a particular issue or topic that you’re focused on, the critical mass for achieving that level of consistency and performance is not inordinate.

00:35:50.340 –> 00:35:51.860
It’s not significant at all.

00:35:52.040 –> 00:35:54.970
It’s 20 to 30 documents.

00:35:55.340 –> 00:36:02.730
So you feed some documents to the system to say, I’m looking for stuff like this.

00:36:02.740 –> 00:36:04.480
Is that right?

00:36:04.490 –> 00:36:05.660
That’s correct.

00:36:06.600 –> 00:36:26.850
So if you’re an organization that has already captured documents, whether it’s draft legislation, regulations, news articles, or whatever else it might be, that existing body of content serves as our initial bootstrapping, instructive in telling the technology, here’s what I’m looking for.
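
[Editor’s note: the seed-document bootstrapping described above can be pictured as scoring each incoming document by its similarity to the 20-30 seed documents. The sketch below uses plain term-count cosine similarity; the seed texts and the idea of taking the maximum similarity are illustrative, not Contexture’s implementation.]

```python
import math
from collections import Counter

def vectorize(text):
    # Bag-of-words term counts.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two term-count vectors.
    common = set(a) & set(b)
    dot = sum(a[t] * b[t] for t in common)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def relevance(doc, seed_docs):
    """Best similarity to any seed document: 'stuff like this'."""
    vec = vectorize(doc)
    return max(cosine(vec, vectorize(s)) for s in seed_docs)

seeds = [
    "draft bill regulating energy pipeline emissions",
    "proposed regulation on electric grid decarbonization",
]
print(relevance("new draft bill on pipeline emissions limits", seeds))  # high
print(relevance("local sports team wins championship", seeds))          # 0.0
```

Anything scoring above a chosen threshold would be surfaced for the user, whose thumbs-up/thumbs-down feedback then refines the model.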

00:36:27.330 –> 00:36:37.140
So we can get you up and running literally in days and achieving those sorts of results in a couple of weeks, as opposed to a couple of months or a year.

00:36:37.270 –> 00:36:42.850
The maturity of the platform can be achieved very quickly.

00:36:43.430 –> 00:36:54.750
With that being said, the flip side here, relative to horizon monitoring, and for our good friend John, is that without having technology like ours as part of the solution,

00:36:55.330 –> 00:37:00.550
the problem is just so large, so pervasive, and so overwhelming.

00:37:00.930 –> 00:37:09.840
It would literally be impossible to do it efficiently, effectively, and sustainably without technology at the heart.

00:37:09.840 –> 00:37:26.160
Historically, Rick, organizations have tried to attempt this type of horizon monitoring, where you’re synthesizing large amounts of content and then applying some level of consistent risk scoring back against that data.

00:37:26.630 –> 00:37:43.660
They’ve had limited degrees of success because they are working with partners that throw huge bodies of human labor at the number crunching, so to speak, doing the manual reviewing at the front end of the curve.

00:37:44.330 –> 00:37:56.910
So those are the reasons why our partner John hasn’t been successful to date in truly implementing a horizon monitoring solution that incorporates the analysis of unstructured content.

00:37:56.910 –> 00:38:07.100
But also this consistent risk scoring: it’s because this type of technology has not been applied to that problem set historically.

00:38:07.110 –> 00:38:13.010
So we’re really at the cutting edge. And it’s sounding to me like, in the legal area,

00:38:13.020 –> 00:38:15.040
You know, you have a big lawsuit.

00:38:15.040 –> 00:38:22.650
you know, there’s an army of lawyers reviewing documents that they’ve requested.

00:38:22.650 –> 00:38:27.540
And the document review process is very labor-intensive.

00:38:27.540 –> 00:38:29.740
It takes a long time.

00:38:30.490 –> 00:38:31.710
Slow going, right?

00:38:32.810 –> 00:38:40.520
Is this the same? Is this the automation of that kind of work?

00:38:40.790 –> 00:38:52.860
Because literally, I’ve looked over the shoulders of lawyers sitting in a room, and they’re literally highlighting key words and phrases, and they literally had to read every document.

00:38:52.860 –> 00:38:55.780
Now, this was a few years ago.

00:38:55.790 –> 00:38:59.740
But you know, how does that relate to this?

00:38:59.920 –> 00:39:03.580
Has the legal document review evolved in the same way?

00:39:03.580 –> 00:39:05.580
And is this the same technology?

00:39:05.720 –> 00:39:06.530
Or is it different?

00:39:07.320 –> 00:39:08.330
That’s a great question.

00:39:08.330 –> 00:39:10.200
It’s very similar technology.

00:39:10.200 –> 00:39:45.250
So what the legal services market refers to as e-discovery has significantly impacted legal services by automating a lot of the review work you just mentioned, and it provides the opportunity for the legal services market to expand the universe of materials and artifacts that can be interrogated relative to a particular aspect of litigation.

00:39:45.820 –> 00:40:22.830
So the type of technology being used predominantly in that e-discovery effort is really helping the market expand the universe of what can be considered. Previously, unless the economics were justifiable and there was a fair amount of certainty of finding that needle in a haystack, lawyers were constrained in the amount of review that could be done, given the vast amounts of resources you just described that a case demands.

00:40:23.510 –> 00:40:36.320
But now, with e-discovery and artificial intelligence being applied to those matters, attorneys are able to really expand the view of all of the artifacts and materials that can be gathered up.

00:40:36.910 –> 00:40:42.890
The technologies that are being deployed there, Rick, are really particular to that market,

00:40:42.890 –> 00:40:48.240
however, because the results that are surfaced have to be admissible in a court of law.

00:40:49.110 –> 00:41:08.020
So there is a workflow and construct around those types of solutions that is very particular to that use case, and as a consequence they don’t really lend themselves to a much broader implementation of our type of artificial intelligence into business challenges like horizon monitoring.

00:41:08.610 –> 00:41:28.390
So those technology businesses serving that market tend to be very niche and very much focused on constructing a very defensible workflow, a workspace within which attorneys can surface these items and then provide them to the court as admissible pieces of evidence.

00:41:28.390 –> 00:41:39.960
But you’re right on point, they’re very, very similar, and at the core the technologies are very much the same relative to using math to count words.

00:41:39.970 –> 00:41:51.140
And it’s interesting that the legal world really has no choice if it’s to do its job efficiently.

00:41:52.330 –> 00:41:59.130
They’ve had to adopt e-discovery and use technology to do their work, right?

00:41:59.130 –> 00:42:28.880
And so I’m using your words and your description to justify my selection of situational awareness as a pillar for RIN, and that it has to be driven by artificial intelligence. Because if you’re big and complex, you can’t physically do the work of monitoring all the things you would like to monitor, or should be monitoring; you have to rely on artificial intelligence.

00:42:28.880 –> 00:42:32.420
It’s the only technology available to go do that.

00:42:32.430 –> 00:42:38.300
Am I smoking my own dope here, or is that a correct statement?

00:42:38.310 –> 00:42:39.860
You know, I think you’re spot on.

00:42:39.870 –> 00:42:45.930
It really does become a technology arms race relative to being competitive.

00:42:46.500 –> 00:43:11.870
And your counterparts, your competitors, your partners are using this type of technology, which they are; artificial intelligence is becoming more ingrained into the fabric of organizations, the enterprise, and even our governments. As a consequence, if you’re not understanding how to deploy AI within your organization,

00:43:11.880 –> 00:43:28.310
you are putting yourself at a strategic disadvantage that will manifest itself across the full scope of any organization: financial performance, market performance, recruiting and retention of key employees, etcetera.

00:43:28.310 –> 00:43:36.940
It’s a necessary component of being an effective and efficient team in public affairs and in government affairs.

00:43:36.940 –> 00:44:02.310
And so if you’re considering heading into a horizon monitoring solution, you need to be thinking about the data science, about artificial intelligence and its technology as part of that solution, because your competitors are, and certainly all the other stakeholders in the market are looking at how AI can impact how they perform horizon monitoring.

00:44:03.290 –> 00:44:10.810
So I know we’ve been talking about all the pros of artificial intelligence for this application.

00:44:10.810 –> 00:44:18.800
Are there any cons? What are the trade-offs people are making to go down this route?

00:44:18.800 –> 00:44:26.500
What do people need to be aware of as they start thinking about getting on this road?

00:44:27.090 –> 00:44:29.180
Yeah, that’s, that’s really important.

00:44:29.180 –> 00:45:03.920
I think there are considerations, as opposed to perhaps cons, relative to implementing artificial intelligence. The key aspect that we consistently run into, with our good friend John at the energy business and across the full spectrum of industry verticals, is just a true lack of understanding at the baseline of what the technology represents, how it can be most productive, and how it can provide a value proposition that impacts the business challenge.

00:45:04.490 –> 00:45:06.290
This isn’t a genie in a bottle.

00:45:06.290 –> 00:45:17.330
There are constraints here, and there are situations in which the technology cannot be effective relative to whatever the business challenge might be.

00:45:17.340 –> 00:45:54.210
We had an example of a really difficult use case, Rick. A client of ours wanted to analyze the whole body of cybersecurity regulation being promulgated by regulators from the municipal level to the federal level, across the full spectrum. And the financial services market in particular, given how important cybersecurity is in that space, is incredibly over-regulated, as financial services tends to be.

00:45:54.590 –> 00:46:23.210
And the aspiration was that our type of artificial intelligence would quickly and easily identify where there was duplication and redundancy in financial regulation being promulgated by the whole host of different regulators that have their hands in that market, and be able to surface that duplication, that redundancy in regulation, where multiple different regulators are asking for the same thing in different ways.

00:46:24.380 –> 00:46:32.100
Our technology was extremely effective in surfacing the areas for consideration.

00:46:32.680 –> 00:46:39.000
But there is very much a need for human interaction with the technology in order for it to be effective.

00:46:39.880 –> 00:46:54.200
So having the technology as part of a solution where there is interaction with subject matter experts, people who understand the nuance of what you’re discovering, interrogating, and curating, is critically important.

00:46:54.780 –> 00:47:07.290
So understanding what the technology can be effective at: what is natural language processing, how does machine learning really improve the solution, what do these technologies represent for my organization?

00:47:07.780 –> 00:47:12.560
Educating yourself before you’re going into that process is really, really important.

00:47:12.570 –> 00:47:28.910
And then establishing the resources and the expectation around the fact that technology is not a silver bullet: you need people to interact with these tools and technologies to continue to refine their effectiveness.

00:47:29.480 –> 00:47:36.650
So there isn’t just a magic bullet here where we just kind of turn on the artificial intelligence and, like

00:47:36.650 –> 00:47:41.990
C-3PO, it will automatically be able to interact with everybody in your organization and understand what they’re talking about.

00:47:42.480 –> 00:48:31.850
So set up the opportunity for success through an understanding of what the technology can and can’t do, and educate yourself around that. Then set a baseline expectation around the resources you need to have on your team, whether internal or external, to be part of that solution as you implement it and roll it out. That’s critically important for horizon monitoring, particularly on a global scale, where the technology needs the interaction and integration of subject matter experts who understand the context of language and are working with the platform to make sure the best, highest, and fullest value is being extracted from it.

00:48:31.860 –> 00:48:52.330
That’s critically important as well. And as I understand it, there can be a number of algorithms and analytical approaches that are used, and it seems that, by industry vertical, the applications are fairly customized to make them work right.

00:48:52.340 –> 00:49:07.620
And you know, as an example, my first introduction to natural language processing was probably a decade ago, in looking at a procurement department’s spend.

00:49:07.900 –> 00:49:28.390
And so if you download the accounts payable file from a large corporation, you can find that they buy vehicles from GM, G M A C, Gen Motors, General Motors, and about five other different abbreviations for that.

00:49:28.400 –> 00:49:37.740
So what we would literally do with that finding is go back and say, do you know that you buy this volume of vehicles from General Motors?

00:49:37.740 –> 00:49:50.900
This is a perfect opportunity to go back to them and negotiate a better price, since you’re such a volume buyer. And you know, it was pretty focused, and it did take human interaction early on, especially because this was so long ago.

00:49:51.270 –> 00:50:06.080
But that nuance of the way language is used, I thought, was instructive. I still use that example to explain to people what natural language processing is today, because it’s such a simple example.
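
[Editor’s note: the vendor-name cleanup Rick describes is an entity-normalization task. A minimal sketch using Python’s standard-library difflib; the vendor list, aliases, and cutoff are illustrative data, not anything from the episode’s actual engagement.]

```python
from difflib import get_close_matches

# Canonical vendor names an analyst might maintain (illustrative data).
CANONICAL = ["General Motors", "Ford Motor Company", "Toyota"]
ALIASES = {"gm": "General Motors", "gmac": "General Motors"}  # known short forms

def normalize(raw):
    key = raw.replace(" ", "").lower()  # "G M A C" -> "gmac"
    if key in ALIASES:
        return ALIASES[key]
    # Fall back to fuzzy matching against the canonical list.
    match = get_close_matches(raw, CANONICAL, n=1, cutoff=0.6)
    return match[0] if match else raw

payables = ["GM", "G M A C", "Gen Motors", "General Motors", "Toyota"]
totals = {}
for vendor in payables:
    name = normalize(vendor)
    totals[name] = totals.get(name, 0) + 1
print(totals)  # {'General Motors': 4, 'Toyota': 1}
```

Rolling the five spellings up to one vendor is what exposes the negotiating leverage Rick mentions.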

00:50:06.080 –> 00:50:23.190
But the tools to do that, you know, there are many, and you can assemble many different tools and algorithms to solve the same problem, and it could be a totally different set of tools.

00:50:23.560 –> 00:50:28.080
So that’s the nature of this technology and the way it’s evolving.

00:50:29.060 –> 00:50:31.900
Which is really, really interesting.

00:50:31.900 –> 00:50:50.390
And so you really need to be careful that you’re selecting something that’s been proven and tested and shown to be useful for the application, not just running willy-nilly with a new offering because it sounds cool.

00:50:51.750 –> 00:50:59.050
So having folks engaged from a teaming standpoint that understand the technology is really important.

00:50:59.060 –> 00:51:18.280
There are platforms out there, Google, Amazon, Microsoft; they have these wonderful cloud environments in which they make their natural language processing and machine learning algorithms available to you, and you can go out there and train your own engine.

00:51:18.290 –> 00:51:20.150
But to your excellent point, Rick.

00:51:20.160 –> 00:51:43.190
If you don’t have a partner that understands how to tune the technology properly, or, to your excellent point, how to set up the various processes in the right sequence so that you are preparing the data for the type of synthesis you want to do. The scenario you described is a little bit of what we call fuzzy matching,

00:51:43.760 –> 00:51:47.540
Where folks may misspell a name.

00:51:48.140 –> 00:51:55.910
whether it’s a proper organization name or, you know, some other more common part of conversation.

00:51:55.920 –> 00:51:57.890
So that’s sort of fuzzy matching.
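
[Editor’s note: fuzzy matching of misspelled names can be illustrated with a character-level similarity score; Python’s standard-library difflib provides one. The compared strings and the 0.8 threshold are illustrative.]

```python
from difflib import SequenceMatcher

def similarity(a, b):
    """0..1 character-level similarity, a simple stand-in for a fuzzy-matching step."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# A likely typo scores much higher than a genuinely different name, so a
# threshold (say 0.8) can merge misspellings without merging distinct entities.
print(similarity("Contexture", "Contexure"))   # high: probably a typo
print(similarity("Contexture", "Accenture"))   # lower: a different name
```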

00:51:57.890 –> 00:52:16.770
That is absolutely an algorithm, and it needs to be implemented in what we call the value chain, the pipeline of specific tools that do different things. Another is what we call stemming, where the tense of the word is irrelevant: whether it was swam, swim, or swimming,

00:52:17.250 –> 00:52:19.010
We generally know what you’re trying to talk about.

00:52:19.010 –> 00:52:29.260
So we call that stemming, where we go back to literally the root of the word, irrespective of its tense and how it might be applied.
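
[Editor’s note: a toy suffix-stripping stemmer, to make the swim/swims/swimming example concrete. Production systems use a full algorithm such as the Porter stemmer; this sketch is illustrative only, and irregular forms like “swam” need lemmatization rather than stemming.]

```python
def stem(word):
    """Toy Porter-style suffix stripping (illustrative, not a full stemmer)."""
    w = word.lower()
    for suffix in ("ingly", "edly", "ing", "ed", "es", "s"):
        if w.endswith(suffix) and len(w) - len(suffix) >= 3:
            w = w[: -len(suffix)]
            break
    # Undouble a trailing consonant left by stripping,
    # e.g. "swimming" -> "swimm" -> "swim".
    if len(w) > 3 and w[-1] == w[-2] and w[-1] not in "aeiou":
        w = w[:-1]
    return w

print([stem(w) for w in ("swim", "swims", "swimming")])  # all reduce to "swim"
```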

00:52:29.850 –> 00:52:44.490
So having a partner that has a deep knowledge and understanding of the technology and can help guide you through an implementation process is also a real key aspect of how you set a project up for success.

00:52:44.490 –> 00:52:45.000
For sure.

00:52:45.070 –> 00:52:50.620
Yes, it’s really critical too. Pete, this has been really helpful.

00:52:50.630 –> 00:52:55.480
Are there any other thoughts you would like to offer to our listeners?

00:52:57.150 –> 00:52:58.570
Yeah, in closing, Rick.

00:52:58.570 –> 00:53:08.770
I think there’s a lot of uncertainty or concern about how artificial intelligence can be deployed.

00:53:09.450 –> 00:53:19.270
There’s a lot of conversation around bias in AI and there’s a lot of concern around artificial intelligence, taking people’s jobs.

00:53:20.710 –> 00:53:35.570
I think before the management team looks into a project, particularly around horizon monitoring, they should be thinking through how best to take advantage of this technology to enable them to be more successful.

00:53:35.570 –> 00:53:50.190
So look at artificial intelligence in the context of how it can be a huge benefit for your enterprise and your organization, and get buy-in from the whole team, from the executive C-suite all the way down to line management.

00:53:50.190 –> 00:54:08.260
So my encouragement for folks who are thinking about AI technology and how to implement it successfully would be to address some of those concerns upfront, whether it’s bias in the AI or how it’s going to be taking people’s jobs.

00:54:08.640 –> 00:54:14.410
That’s not the focus for this type of implementation and the promise of what it represents.

00:54:14.410 –> 00:54:24.060
So that would be some consideration for engaging with all the stakeholders across the spectrum on how to achieve horizon monitoring and radical influence.

00:54:24.740 –> 00:54:26.340
Hey Pete, thank you very much.

00:54:26.340 –> 00:54:31.860
I’d like to just close with one short commercial: on the Radical Influence Network website

00:54:31.870 –> 00:54:35.280
is the feed from Media Watch from Contexture.

00:54:35.280 –> 00:54:40.160
It’s available to any member of the radical influence network.

00:54:40.540 –> 00:54:51.770
Each of the tags is live, and the tags match up to the six pillars of radical influence.

00:54:51.780 –> 00:54:54.860
So you should be able to filter by that.

00:54:55.520 –> 00:55:07.170
And you’ll be able to see a variety of articles, from a variety of sources you probably would never think of, to learn something more about each of the pillars of radical influence.

00:55:08.040 –> 00:55:10.020
So that’s it for this session everyone.

00:55:10.030 –> 00:55:12.360
Thank you very much, and we’ll see you soon.
