Automatization and AI in the fight against money laundering / session notes
Today and tomorrow (April 23-24, 2020) I am “at” the Budapest Startup Safari. I attended it in person in the previous two years, and since it is online this year, I logged on. The first session I checked out was “Automatization and AI in the fight against money laundering”. I was interested in this topic because it is one of the fields the company I work for operates in. The topic was presented by Oliver Lebhardt, the CEO and co-founder of Complytron. He described and showed not just what his company is doing but its precursor project as well.
Lebhardt has a background in journalism, and his previous startup, SourceCodeLeak, focused on creating tools for journalists (and others) to “help you uncover hidden business interests based on digital fingerprints.” During the session he demoed a version of the tool, which seamlessly combines two functions: a scraper and a visualizer. First, you, the user, give it a list of URLs/websites, at least four but not more than 100. Then the tool scrapes the sites, i.e. pulls down their source code and gathers other publicly available information about them. Finally, it displays the results in a spreadsheet format and in two different visualizations. These comparison charts can help uncover connections, similarities and differences between websites. Journalists can use them to draw their own conclusions about whether the websites are operated by the same people or entities.
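Lebhardt did not share any code, so here is only a minimal Python sketch of the general idea as I understood it: fetch each site’s source, pull out identifier-like strings, and report what any pair of sites has in common. The regex patterns, the 4-to-100 URL check and the comparison logic are my own illustrative assumptions, not SourceCodeLeak’s actual implementation.

```python
import re
from itertools import combinations

import requests

# Illustrative "digital fingerprint" patterns (my guesses, not the tool's):
# shared analytics or ad-network IDs are classic signs of common ownership.
PATTERNS = {
    "google_analytics": re.compile(r"UA-\d{4,10}-\d{1,4}"),
    "adsense": re.compile(r"pub-\d{10,20}"),
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
}

def fingerprint(url: str) -> dict:
    """Pull down the page source and collect identifier-like strings from it."""
    html = requests.get(url, timeout=10).text
    return {name: set(rx.findall(html)) for name, rx in PATTERNS.items()}

def compare(urls: list) -> None:
    """Print identifiers shared by any pair of sites."""
    if not 4 <= len(urls) <= 100:
        raise ValueError("the demoed tool accepted between 4 and 100 URLs")
    prints = {u: fingerprint(u) for u in urls}
    for a, b in combinations(urls, 2):
        shared = {
            name: prints[a][name] & prints[b][name]
            for name in PATTERNS
            if prints[a][name] & prints[b][name]
        }
        if shared:
            print(f"{a} <-> {b}: {shared}")

if __name__ == "__main__":
    compare(["https://example.com", "https://example.org",
             "https://example.net", "https://example.edu"])
```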
I described this process in such detail because I liked it and was already familiar with it. I have been working for G2 Web Services since 2006, and I was involved in a project using similar ideas and tools. We helped take down hundreds of illegal prescription websites. I can talk about it now because the first “Operation Pangea” was such a long time ago. Here is a summary of what the project was and of G2’s contribution to it. It was fascinating to watch today’s presentation of someone else doing something similar to what I was doing a decade ago. Observing the differences and similarities was worth my time.
Then he moved on to what his current company does. It focuses on the anti-terrorism and anti-money-laundering fight. The official description on LinkedIn read: “Complytron helps AML/KYC professionals to fight financial fraud through digital fingerprints. It develops scrapers and network forensic Software as a Service (SaaS) that uses machine learning and deep learning technology.” Let me translate that for you: the company gathers publicly available information about websites, companies and people. This includes scraping many public databases of criminal records, violations and similar information.
They then built a database that is searchable through their website. For example, if you want to know whether a person has had any legal issues or was associated with any wrongdoing, you can search in this one centralized place. The most interesting aspect for me came when Lebhardt answered a question at the end: his company uses over a thousand vectors, i.e. there are that many aspects of a company/person/URL that can be gathered and potentially searched. It is exciting for me from a data scientist’s perspective (can you imagine the complexity of their database?), from a user’s perspective (so many things I could learn about the subject of my search) and from a subject’s perspective (“OMG, what is available about me online?”).
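To make the “thousand vectors” remark a bit more concrete, here is a toy Python sketch of what a search over many per-subject attributes could look like. The record structure, field names and sample data are all hypothetical; Complytron’s actual schema and SaaS interface were not shown.

```python
from dataclasses import dataclass, field

# A toy stand-in for the kind of record such a system might keep; the real
# schema was not shown, and "vectors" here are simply named attributes.
@dataclass
class SubjectRecord:
    name: str
    attributes: dict = field(default_factory=dict)  # potentially 1000+ keys per subject

RECORDS = [
    SubjectRecord("Example Holding Ltd.", {"registry_id": "01-09-000000",
                                           "sanction_list": "none",
                                           "analytics_id": "UA-1234567-1"}),
    SubjectRecord("J. Doe", {"court_record": "2018 fraud conviction (fictional)",
                             "sanction_list": "none"}),
]

def search(term: str) -> list:
    """Return every record whose name or any attribute value mentions the term."""
    term = term.lower()
    return [
        r for r in RECORDS
        if term in r.name.lower()
        or any(term in v.lower() for v in r.attributes.values())
    ]

if __name__ == "__main__":
    for hit in search("fraud"):
        print(hit.name, hit.attributes)
```

In a real system each subject would carry far more attributes and the matching would be fuzzier, but the shape of the problem is the same: one query fanned out over a very wide record.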
I was glad to learn about both of these ventures and to see the demos. It is good to know that the tools to fight crime are getting more sophisticated, more widely available, simpler to use, more reliable and less expensive.