How Google’s latest spam update could hurt your music website

For musicians, having an easily accessible website is crucial. However, Google has recently tightened its criteria for what it considers spam. Here’s how to make sure your website won’t be swept under the rug.

by Bobby Owsinski of Music 3.0

Many musicians, artists, and songwriters rely on social media for their online presence, but that can be a mistake. Your website is the best place for your fans or potential fans to get info about you, because only you control it; you’re not at the whim of the latest update from a social platform. That said, you want your site to be found during a Google search, so SEO (search engine optimization) is just as important as the content on your site. And SEO can be affected whenever Google updates its search algorithm or, as in this case, issues a spam update.

According to Google, its latest spam update is powered by what it calls SpamBrain. Its website says, “SpamBrain is our AI-based spam-prevention system. From time to time, we improve that system to make it better at spotting spam and to help ensure it catches new types of spam.”

Website Spam

But what does that mean for your website?

Google didn’t give many specifics, but, as always, SEO companies everywhere are testing the latest model to figure out what has changed. NP Digital tracked 900 million domains to identify patterns, and here’s what it found.

A site with poor-quality content is considered spam by Google, so although it will show up in search results, it won’t appear anywhere near the first pages. Here’s what Google considers spammy content:

  • Thin content – This doesn’t mean pages with few graphics or a low word count, but content that doesn’t provide much value, meaning that once you’ve finished reading, you’re not much better informed about the subject. Above all, Google favors a rich user experience in its search algorithm, and a page with thin or deceptive content doesn’t fall into that category.
  • Poor meta tags – This means tags written more for search bots than for real people, or a site that uses the same tags on every page.
  • Keyword stuffing – It’s hard to believe that people still do this, but they do. This means adding dozens of keywords in an attempt to get attention even when the content doesn’t apply. For instance, if you used Taylor Swift as a keyword but there’s nothing on the page about Ms. Swift, that’s stuffing. By the way, the recommended ratio for keywords is about 1 per 100 words (there are only 2 in this post); see the quick check after this list.
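
If you want a rough way to check your own pages against that 1-per-100-words guideline, here’s a minimal Python sketch. The file name and keyword are placeholders for your own page copy; this is just an illustration of the ratio, not a tool Google provides.

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return occurrences of the keyword phrase per 100 words of text."""
    words = re.findall(r"[\w']+", text.lower())
    if not words:
        return 0.0
    phrase = keyword.lower().split()
    n = len(phrase)
    # Count occurrences of the (possibly multi-word) keyword phrase
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == phrase)
    return hits / len(words) * 100

# Hypothetical example: "bio_page.txt" holds the text of one page on your site
page_text = open("bio_page.txt").read()
density = keyword_density(page_text, "indie rock")
print(f"~{density:.1f} mentions per 100 words")
if density > 1.0:
    print("Possible keyword stuffing -- consider trimming repeats.")
```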

There’s more, but these are the easiest issues to spot without being an SEO maven. Use common sense, create pages to be consumed by humans instead of search bots, and you should be okay.

Bobby Owsinski is a producer/engineer, author and coach. He has authored 24 books on recording, music, the music business and social media.