Updating Your Bank’s Security Training for the Age of AI

by

Jeremy Pogue

How much could AI-driven models like Copilot for M365, Google Gemini, or Apple Intelligence improve productivity at your bank? The jury is still out on that one, but initial experiments place the overall AI-driven productivity gains for the US economy at between 8 and 36 percent, according to a recent release from the Federal Reserve Bank of St. Louis. Researchers are quick to point out that these gains may take years to realize, much like the productivity gains we saw from the introduction of computers themselves.

So, does this mean you should let your bank take a “wait and see” approach with the tech, only changing the way you do things when you decide to invest in AI tools?

The answer is most definitely no. No matter when you plan to formally introduce AI platforms on your systems, it’s critical that you begin training your employees to think differently about data in the AI age. They’ll need the right kind of bank security training to know how to stay safe against data loss and the coming tsunami of AI-enabled hackers.

 

Bring Your Own AI Policies? We’re Almost There. It Will Change Everything About Your Bank’s Security Training.

Chances are good that your employees are already using AI. They’re asking ChatGPT to write that tricky email. They’re using AI subscription services to create art and copy or to analyze data. App developers are lining up to create AI-driven convenience that’s available right on employees’ phones.

With the announcement of Apple Intelligence and ChatGPT integration in the newer iPhone models, employees can use those engines to parse work photos, documents, messages, and conversations, especially if they’re working from home.

You may not have decided yet whether to implement AI on your systems. But make no mistake, employees will bring their own AI into the workplace, like it or not. The time to get your arms around this cybersecurity risk is now.

Get our free AI policy template for your business. Your team is using AI tools like Copilot and ChatGPT to handle work. Make sure usage is ethical and secure with our free AI Acceptable Use Policy Template.

 

Four Key Ways AI Will Change Your Bank’s Security Training Strategy

Best practices for AI cybersecurity are still evolving, and no one has all the answers yet. However, there are some key bank security training strategies you can implement now that will help prepare your employees for the AI-enabled future. While I’m sure more guidance will come from the cybersecurity industry, here are some basics every bank should consider now.

 

#1—Get Buy-In for Your Bank’s Cybersecurity Training with a New AI Fair Use Policy

As more employees bring devices with embedded AI tools to work every day, having written rules about responsible AI use will be crucial. Specifically, your written AI use policy should include instructions on:

  • Not downloading company information into generative AI models
  • Not using AI tools embedded in your phone to search or train on company documents, conversations, emails, or messages
  • Not taking photos that may include monitor screens or protected data
  • The definition of “protected data”
  • Not using AI to create photography or illustrations for which copyright permission hasn’t been explicitly granted

 

#2—Make Conditional Access Mandatory and Teach Your People to Think of Their Information Differently

Training should begin now to prepare for “conditional access” to your documents. As a financial institution, you already have safeguards around client financial data through your core providers. However, more thought needs to be given to the types of documents that may one day be searched and summarized by your AI tools.

Fortunately, M365 with Copilot will include a mandatory process for naming who can access the files you create. This will create “access levels” that limit who can view, search, or summarize documents in your system.

Now is the time to tell your employees to remove outdated files and unneeded secondary versions. Then, when conditional access becomes available, training for this new process will be much easier.

 

#3—Announce Your PEN Testing and Get Staff Involved in the Effort

Penetration testing, or “PEN testing” as it’s often called, is a coordinated mock cyberattack protocol designed to catch your employees clicking on the wrong links. When they take the bait, they are usually given additional cybersecurity training. This is great, of course, and a required part of keeping your bank compliant and safe.

In my experience running these, though, at least ten percent of your employees will always fall prey to the fake hack. Most companies don’t tell employees in advance about the PEN test, and then when they are caught, they are made to feel like criminals.

I recommend a different and more effective approach. Instead, let your employees know in advance that the test is taking place. Tell them the kinds of challenges coming their way, and enlist their help to find and report all the simulated attacks. Take the time to step up your training before the test begins, and you’ll see better retention and results.

 

#4—Drive Home the Importance of Multi-Factor Authentication

For employees working from home or traveling between locations, it can be tempting to email documents to a home computer. AI tools now semantically read everything on your devices and in your camera roll. It’s critical that your employees work ONLY through devices protected by your two-step verification process. Your system must be backed by a zero-trust framework that continuously verifies them as they work.

You may have covered this before, but with AI in the picture, your MFA and zero-trust frameworks need to be strengthened. A big part of that is renewing your training on the issue.

 

Interested in Improving Your Bank’s Security Training and Communications?

Integris offers a full suite of online, graded, and tracked employee cybersecurity awareness training programs—perfect for meeting regulatory requirements. If you’re interested in setting them up at your bank, contact us for a free consultation.

Jeremy Pogue serves as Director of Security Services at Integris.
