Forter, which has raised more than $50 million in venture capital funding, most recently a Series C round last year, hosted a panel presentation Thursday at Gabriel Kreuther in Manhattan to discuss online fraud. The company works with firms to help them combat fraud online, and is also about to introduce a product to help with account takeovers, or ATOs in tech parlance.
Michael Reitblat, chief executive officer and cofounder of Forter, said before the panel discussion that his firm, through its data checks, had noticed an increase in online account activity shortly before it was disclosed last week that credit reporting agency Equifax had been hacked. ATOs occur when fraudsters gain access to a consumer’s online account at a merchant and take it over, using the financial information attached to the account to buy merchandise, and even changing passwords and other “confidential” information so they can continue to use the account. Also at risk once they gain access are points, the electronic currency consumers accumulate through past purchases associated with the account.
Reitblat said it is still too early to know the “scale and scope” of the Equifax hack, adding that “all the data is not being used yet.”
Olivier Hepner, the operations point person at delivery.com, noted that there are, on average, about 250,000 attempts to hack into accounts every 15 minutes.
Panelists included Jed Kleckner, CEO of delivery.com; Ovadia Labaton, vice president of strategy and business development at Kidbox; and Matthew Arthur, customer experience manager at Malin-Goetz.
Kleckner, a Forter client, said many firms still use manual systems (read: human intelligence) to distinguish between good and bad accounts. He noted that such manual systems are essentially rule-based, and that their chief flaw is that they can “make any system look good even though it is bad.” He also noted that one way to counter the use of points in case an ATO occurs is to put a time limit on their use, but said that also has the effect of “worsening the customer experience.”
Labaton’s Kidbox is a kids’ version of StitchFix, a model he described as a “loan” to parents because it sends out curated items before payment is made. That exposes the company to credit risk: it needs to identify which users intend to pay, which ones are good customers who might have a temporary blip on their card that prevents its use, and which ones have no intention of paying for the merchandise. He also spoke of how his company used to delay shipments and wait for a reaction in order to deter fraudsters, but found that this backfired because irate consumers would take to social media to complain, which ended up hurting the business. Forter doesn’t share individual clients’ data, but Labaton said its aggregation of data makes the entire data-checking network more efficient, an approach he said his firm prefers because it decided it is “better not to have fraud in the ecosystem.”
Arthur said that before becoming a Forter client, he used to Google customers’ names and check publicly available information, such as LinkedIn accounts, to guess whether or not to approve and ship orders. He spoke of consumers who would place orders over the weekend and expect immediate shipment, only to find out on Monday, after he had a chance to do his checks, that the shipment was never made. “That’s not a good customer experience,” he concluded. Since relying on Forter’s algorithms, Arthur said, there have been situations where he might otherwise have rejected an order but allowed it to go through on Forter’s recommendation, only to find that those customers have since placed replenishment orders.
Reitblat said AI technology, which has many benefits, is still in its early stages, and likened it to a “very smart three-year-old.” But there was also talk of how a retailer, using a teen’s purchasing behavior, including a pregnancy test, began sending the teen baby-related advertisements, only to have to deal with an irate parent who didn’t know the teen was pregnant.
Labaton noted that management needs to be involved in the conversations with the tech team on how best to use AI, to “make sure there’s some common sense behind it.”