The Oil & Gas Journal, first published in 1902, is the world's most widely read petroleum industry publication. OGJ delivers international oil and gas industry news; analysis of issues and events; practical technology for design, operation, and maintenance of oil and gas operations; and important statistics on energy markets and industry activity.

OGJ is edited to meet the needs of engineers, geoscientists, managers, and executives throughout the oil and gas industry. It is part of Endeavor Business Media, Nashville, Tenn., which also publishes Offshore Magazine.

Endeavor Business Media’s Petroleum Group also produces targeted e-Newsletters; hosts global conferences and exhibitions, seminars, and forums; and publishes directories, technical books, print and electronic databases, surveys, and maps.

Additional Information

Website & Technical Help

For help with subscription purchases or refunds, or trouble logging into the paid subscription content on www.ogj.com, please contact Customer Service at [email protected] or call 1-847-559-7598.


Tips for Protecting Yourself from Deepfake Fraud


SPONSORED CONTENT -- (StatePoint) You answer the phone and hear a familiar voice, but are you sure you know who is on the other end of the line? The correct answer should be "no."

Rapid advancement of artificial intelligence (AI) has armed bad actors with sophisticated tools for impersonation fraud using deepfakes. A deepfake is audio, video, or imagery that has been created or altered using AI. The danger is that with a short sample of audio or video, or even a few images, a criminal can create a deepfake that is almost impossible to detect.

A 2023 National Institutes of Health study found that even when participants were warned that one of the five videos they were about to review was a deepfake, only 21.6% correctly identified the fraudulent one.

In today’s technological climate, the risks are too high to trust the naked eye alone. Fortunately, there are steps you can take to reduce your likelihood of getting duped, and they don’t require specialized skills or technical aptitude.

One important habit is never to assume that someone contacting you is legitimate, even if they share a seemingly authentic image or video, have a familiar voice, or come under the guise of someone known and trusted. Creating safe words with close family members and friends is also a good idea: agree on a secret code word only they know. If a caller doesn’t know it, that is an easy and effective way to identify a fraud.

Just as important as spotting deepfakes, if not more so, is protecting your identity from being exploited to create one. Here are some measures that can help mitigate the risk:

Social Media Management: Don’t over-post your face or voice and limit who can view your content by tightening up privacy settings on your accounts.

Watermarks on Images: Consider putting watermarks on imagery you post online to discourage repurposing of it (see the sketch after this list).

Stronger Identity Protection: Subscribe to identity monitoring services that will send alerts if your personal information appears on the dark web where criminals buy and sell stolen credentials.

Limited Voice Exposure: Screen phone calls and only answer calls from known numbers. It’s also recommended to use the default voicemail greeting rather than recording an outgoing message in your own voice.

Prompt Reporting: If you are contacted by someone you believe is leveraging a deepfake to impersonate a person or brand, report it immediately to that person or brand. They will want to work with the authorities and time is of the essence.
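For readers comfortable with a bit of scripting, watermarking can be automated before photos are posted. Below is a minimal sketch using Python's Pillow imaging library; the file names, handle text, and tiling spacing are illustrative assumptions, not recommendations from the article.

from PIL import Image, ImageDraw, ImageFont

def add_watermark(src_path, dst_path, text):
    """Tile semi-transparent text across an image so cropping can't easily remove it."""
    base = Image.open(src_path).convert("RGBA")
    overlay = Image.new("RGBA", base.size, (0, 0, 0, 0))
    draw = ImageDraw.Draw(overlay)
    font = ImageFont.load_default()  # a TrueType font via ImageFont.truetype() gives larger text
    step = 150  # spacing of the repeated mark, in pixels (arbitrary choice)
    for x in range(0, base.width, step):
        for y in range(0, base.height, step):
            draw.text((x, y), text, font=font, fill=(255, 255, 255, 96))  # faint white text
    Image.alpha_composite(base, overlay).convert("RGB").save(dst_path, "JPEG")

# Example: stamp a social handle on a photo before posting it.
add_watermark("vacation.jpg", "vacation_watermarked.jpg", "(c) @my_handle")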

To learn more about tools and techniques to help protect your identity and respond to fraud, visit PNC’s Security & Privacy Center.

Technology is giving criminals ever more sophisticated tools to commit fraud, but good, old-fashioned caution and common sense are still the best protection, and that applies to deepfakes, too.

Photo Credit: (c) Jose Calsina / iStock via Getty Images Plus
