

Last week, I was tagged in an Instagram post. Thinking nothing of it, I clicked on it and found myself staring at a video of my own kitchen. “Huh,” I thought, “why would someone repost this video of my kitchen?” Then I clicked on the profile and saw something much worse.
The account had a generic woman’s name. The profile picture was of someone else. But every single photo was me. Photos I had posted to Instagram last week, photos I had posted 3 years ago. Photos of my outfits, photos of my lattes, and even a photo of my cat. It wasn’t every single Instagram post I’d made in the last 3 years, but it was a healthy smattering, totaling 260 posts.
Not to worry, I thought, I’ll just report this for impersonation! Instagram, after all, has a button for that. Within 30 seconds of my report, I got a notification. Instagram did not find that this account violated its guidelines. The account stayed up.
When I said I wasn’t worried, I was lying. Instagram and Meta customer service is nonexistent, and many people have noted that concerning posts, including fraudulent ads, are rarely taken down.
I decided on a deluge approach and took to my Instagram stories, asking people to report the account. The most common question I got in reply was “why would someone do this to you?” If you’ll indulge me, I think this question is worth answering here, with some (very slight) insider knowledge that comes from being a content creator:
It’s all fake
When people asked why someone would do this to me, I thought that was probably the wrong question to ask, because I highly doubt an individual human woke up one day and decided to steal all my photos. Instead, this account was probably a bot, like an estimated 95 million other Instagram accounts, at least per the most recent year I could find major sources for (2018!!!). In fact, TikTok commenters have started latching onto the “dead internet theory,” which claims that most interactions online are between and among AI. This theory seemed far-fetched a couple of years ago, but now feels possible as the volume of AI content online has caught up with human content. Similarly, X just rolled out a new feature only to immediately remove it after it revealed a major bot problem.

Buying bot followers is cheap — there are hundreds of websites offering the service. In general, I assume that every company account I view has a follower list primarily composed of bots. Because Instagram used to require users to have 10,000 followers to add links to stories, it seemed like standard practice to start your company’s IG, buy 10,000 followers, and go from there.
If you’re freaked out at this point, know that bought followers are sometimes obvious: just compare the number of followers to the views on an account’s Reels. A significant disparity is usually a clue (e.g., 20,000 followers but every Reel getting 1,000 views).
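For the curious, here’s roughly what that eyeball test looks like as arithmetic. This is just a sketch with made-up numbers and an arbitrary cutoff; it isn’t anything Instagram exposes, and the function name and threshold are mine.

```python
# A minimal sketch of the eyeball test above, assuming you've manually
# noted an account's follower count and the view counts on a few recent Reels.
# The 10% threshold is an arbitrary illustration, not an official metric.

def looks_like_bought_followers(followers: int, reel_views: list[int],
                                threshold: float = 0.10) -> bool:
    """Flag an account whose average Reel views are a tiny fraction of its followers."""
    if not reel_views or followers == 0:
        return False
    avg_views = sum(reel_views) / len(reel_views)
    return (avg_views / followers) < threshold

# The example from the post: 20,000 followers but Reels hovering around 1,000 views.
print(looks_like_bought_followers(20_000, [1_100, 950, 1_020]))  # True -> suspicious
```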
But why?


Still, I wondered why someone would steal my photos: was the account posting comments? Sending spammy DMs? There’s a chance that this account was just being used as a purchasable follower. Many services tout the ability to deliver “real, active followers” that avoid detection on Instagram. Reposting my photos would certainly make an account look active.
But a friend in tech explained to me that there’s also a vast black market for selling and buying Instagram accounts. He thought it was most likely that some program was scraping all my images, uploading them to an account, and doing the same across many other accounts. When the accounts get to a big enough following size, they can be sold and turned into spam or porn accounts. Here’s a pretty creepy how-to post about selling an Instagram account. This is an issue that Meta is finally going after, but it’s also not totally blameless in this scheme, as some employees have been bribed to verify fake accounts (if you only click one link in this post, make it that one!).
The face staring back at you
You know when you would clean or rearrange your room as a kid, and then step out of the room and step back in to view it as someone else would see it? It turns out that having your photos stolen creates the same effect.
I knew that this was not some human manually selecting which photos to repost; still, I found myself intrigued by the “choices”. There are only about 4 photos of me in a swimsuit across my entire page, and all 4 of them made the spam account. My wedding photos were there, plus photos of my husband and me on 2 vacations. Did swimwear and a happy relationship sell? Should I be playing more to those categories?


Even better, the account never stole my captions. My tech friend said this would be one way to avoid detection as an impersonator. Instead, they were all replaced with (I assume) AI drivel that was both hilarious and delusional. Personal favorites above. I wondered: was it time to be sassier online?
But what I really learned is what content creators have known for a long time: Instagram doesn’t care about you and will only help you if you pay $15 a month for a blue verified checkmark. Even after shelling out the money, it took me three different chats with Meta support to get the account removed, despite each support agent agreeing with me that this was clearly impersonation.
I should probably have a stronger conclusion for you, but I don’t! When you don’t own the platform you create on, you are subject to its whims. When that platform is also where a lot of us get our news and form our opinions, the news is much, much worse. I’ll still be over there, I guess, with my new blue checkmark as a flimsy shield.