Say Goodbye to Invasive App Installs!

Great news for everyone who primarily uses a mobile device to access the internet!

Starting November 1, Google will implement important new measures that should change the landscape of mobile browsing. Yes, it is finally the beginning of the end for those invasive app install interstitials that take over your mobile browser.

Google posted an article detailing the new policy on its official Webmaster Central blog:

“…mobile web pages that show an app install interstitial that hides a significant amount of content on the transition from the search result page will no longer be considered mobile-friendly… As an alternative to app install interstitials, browsers provide ways to promote an app that are more user-friendly.”

App install banners, which Google supports as a less intrusive alternative, will now become the norm. Is there anything worse than opening a fresh new page, only to have the entire screen dominated by an app install request? Luckily, Google is working to get rid of nuisances like that and to make mobile browsing as seamless and user-friendly as possible.

For more information, check out the official Google blog post:

http://googlewebmastercentral.blogspot.ca/2015/09/mobile-friendly-web-pages-using-app.html

Using “Honeypot” Banners to Detect Click Fraud

In our previous article, we discussed the extensive damage fake traffic continues to inflict on the online advertising industry. To recap: approximately a third of the total traffic transmitted through online ad agencies is fake, amounting to more than US$15 billion worth of traffic.

Fake traffic and bots have become an accepted reality for advertisers. Most advertisers set a threshold to account for bots, known as the “Maximum Level of Suspicious Impressions.” For mainstream traffic, this threshold is typically set at a maximum of 5%.
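
To make that threshold concrete, here is a minimal Python sketch of how such a check might look. The function names, inputs, and the way suspicious impressions are counted are illustrative assumptions, not any particular ad platform’s API; only the 5% ceiling comes from the paragraph above.

```python
# Illustrative sketch of a "Maximum Level of Suspicious Impressions" check.
# The 5% ceiling mirrors the threshold described above; everything else is hypothetical.
MAX_SUSPICIOUS_SHARE = 0.05

def suspicious_share(total_impressions: int, suspicious_impressions: int) -> float:
    """Return the fraction of impressions flagged as suspicious."""
    if total_impressions == 0:
        return 0.0
    return suspicious_impressions / total_impressions

def exceeds_threshold(total_impressions: int, suspicious_impressions: int) -> bool:
    """True if a campaign breaches the agreed suspicious-impression ceiling."""
    return suspicious_share(total_impressions, suspicious_impressions) > MAX_SUSPICIOUS_SHARE

# Example: 1,000,000 impressions with 62,000 flagged -> 6.2%, over the 5% cap.
print(exceeds_threshold(1_000_000, 62_000))  # True
```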

However, as online advertising continues to grow and evolve, bots are also becoming more advanced. In the past, bots were much easier to detect: they could be spotted by abnormal behavior coming from a common IP or a common user agent, or tracked by source, time on site, bounce rate, number of new sessions, average pages per visit, and other behavioral signals. Advances in fraud have produced bots with the startling ability to mimic human behavior: they can now open several pages and click on many buttons, simulating a human user. These new bots are much smarter and harder to track.
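
As a rough illustration of those classic heuristics, the sketch below assigns a crude bot-likelihood score to a session. The field names, thresholds, and weights are hypothetical examples chosen for clarity, not real detection rules; modern bots are precisely the ones such simple checks miss.

```python
# Illustrative heuristic scoring of a session. Thresholds and weights are made up.
from dataclasses import dataclass

@dataclass
class Session:
    ip: str
    user_agent: str
    time_on_site_sec: float
    pages_per_visit: float
    bounced: bool

def bot_score(session: Session, sessions_from_ip: int) -> int:
    """Crude additive score: the higher the score, the more bot-like the session."""
    score = 0
    if sessions_from_ip > 1000:                 # unusually many sessions from one IP
        score += 2
    if not session.user_agent or "python-requests" in session.user_agent.lower():
        score += 2                              # missing or scripted user agent
    if session.time_on_site_sec < 2:            # near-instant visit
        score += 1
    if session.bounced and session.pages_per_visit <= 1:
        score += 1                              # single-page bounce
    return score

s = Session(ip="203.0.113.7", user_agent="", time_on_site_sec=0.4,
            pages_per_visit=1.0, bounced=True)
print(bot_score(s, sessions_from_ip=5000))  # 6 -> very likely automated
```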

So what is the best method for an advertiser or a domain owner to identify click fraud? At TrafficTraffickers.com, alongside the traditional detection techniques, we use the “honeypot approach.” It can be your best solution when a CAPTCHA is not an option.

When we suspect fraudulent clicks, we create a banner that is not visible to human users. We design it to fit the context of the web properties we want to monitor and give it naming consistent with the rest of the banners. To any tracking system, these banners look identical to a normal, visible banner, with one important exception: bots will click on them, but humans will not (allowing a small margin of error for accidental human clicks).
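
A minimal sketch of the idea might look like the following. This is not our actual implementation: the Flask app, route names, and in-memory logging are assumptions made for illustration. The point is simply that a hidden duplicate of a normal banner routes clicks to an endpoint that only automated agents ever reach.

```python
# Illustrative honeypot-banner sketch (hypothetical routes and page markup).
from flask import Flask, request

app = Flask(__name__)
suspected_bots = set()  # IPs that clicked the invisible banner

PAGE = """
<a href="/banner/summer-sale">Summer Sale - click here</a>
<!-- Honeypot: identical-looking link, but hidden from human visitors. -->
<a href="/banner/summer-sale-hp" style="display:none"
   aria-hidden="true" tabindex="-1">Summer Sale - click here</a>
"""

@app.route("/")
def page():
    return PAGE

@app.route("/banner/summer-sale")
def real_banner_click():
    return "Thanks for your interest!"  # legitimate click path

@app.route("/banner/summer-sale-hp")
def honeypot_click():
    # Humans cannot see the hidden banner, so whoever lands here is suspect.
    suspected_bots.add(request.remote_addr)
    return "", 204

if __name__ == "__main__":
    app.run()
```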

This little trick is a great way to identify the usual suspects and cut down on bots and fake traffic.

The Cost of Fake Online Traffic

Online advertising is a booming industry that continues to grow rapidly: it accounted for around US$50 billion in 2014, and these numbers are expected to increase dramatically every year. There is also an international mobilization at the highest levels of government to close the digital divide; the latest example is the US government’s initiative to provide high-speed internet access to Cuba (ref: http://www.wsj.com/articles/u-s-sets-a-priority-in-cuba-open-internet-1421792275).

The graph below (ref: http://www.itu.int/en/ITU-D/Statistics/Pages/facts/default.aspx) depicts the current trends:

[Graph: ITU statistics on global internet growth]

There are many fundamental differences between online advertisements and conventional advertising platforms such as newspapers and TV ads, but one difference stands above the rest: the existence of fake traffic on online platforms.

When an advertiser places an ad on the cover of a magazine, one can safely assume that the ad in question has been viewed at least as many times as the particular edition of that magazine has been sold.

When it comes to web traffic, however, one cannot make the same assumption, due to the existence of fake traffic.

Fake traffic is generated by bots and programs. This traffic is highly profitable to the select few that operate it, but highly damaging to everyone else.

Fake online traffic accounts for about a third of the total traffic transmitted through online ad agencies (ref: http://www.wsj.com/articles/SB10001424052702304026304579453253860786362).

If you do the basic math, the numbers are staggering: one third of US$50 billion is roughly US$16.7 billion. That’s over 15 billion dollars’ worth of fake traffic being bought, sold, and proliferated around the internet.
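
Spelled out as a quick back-of-the-envelope calculation, using only the approximate figures cited above:

```python
# Back-of-the-envelope estimate of the money spent on fake traffic.
total_ad_spend_usd = 50e9    # approximate 2014 online ad spend
fake_traffic_share = 1 / 3   # roughly a third of ad-agency traffic is fake

fake_traffic_value = total_ad_spend_usd * fake_traffic_share
print(f"~${fake_traffic_value / 1e9:.1f} billion of spend goes to fake traffic")
# ~$16.7 billion of spend goes to fake traffic
```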

With this information in mind, it is crucial for advertisers to seek quality traffic over quantity. Cut the fake impressions out and your conversion ratios could be 36% better than they currently are; raw numbers don’t mean anything if they don’t reflect quality traffic.

We developed our ad-serving platform, http://www.traffictraffickers.com/, with the goal of improving our advertisers’ conversion rates by at least one third. Our focus is not on having billions of daily impressions, but rather on delivering high-quality traffic from real users who are genuinely interested in the offer advertised.

To conclude: the next time you are setting up an advertising campaign, focus on quality and conversion over sheer quantity!

Google’s April 29th Update Is Causing Organic Traffic Fluctuations

If you are a webmaster, chances are you have been seeing fluctuations in your Google organic traffic since the beginning of May 2015.

Some of our web properties experienced gains in organic traffic, while others witnessed a significant drop during the last couple of weeks.

In a statement to Searchengineland.com, Google confirmed that updates to its ranking algorithm have been made. The search engine giant did not provide much information on the matter; however, it said that it changed the way its algorithm assesses quality (ref: http://searchengineland.com/the-quality-update-google-confirms-changing-how-quality-is-assessed-resulting-in-rankings-shake-up-221118).

As technology advances, Google is getting smarter and more efficient. Shady SEO techniques that used to work well in the past will only get your site penalized today.

Instead of focusing on how to cheat the algorithm, we recommend a better understanding of search engines’ purpose.

What does Google want? Simply put: to understand a user’s search query and present the most suitable answer, from the most authoritative source, in the best possible format, at the top of the results page, as quickly as possible. Although most webmasters I know want to increase their users’ time on site, search engines are looking for ways to do just the opposite.

My recommendations: focus on your clients and user experience instead of worrying about ways to increase your organic traffic. Build a quality website that adds value to users. Fortunately, we know exactly how Google assesses quality.

Below are the questions that Google’s algorithms try to answer when assessing the quality of your site (ref: http://googlewebmastercentral.blogspot.ca/2011/05/more-guidance-on-building-high-quality.html):

  • Would you trust the information presented in this article?
  • Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
  • Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
  • Would you be comfortable giving your credit card information to this site?
  • Does this article have spelling, stylistic, or factual errors?
  • Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?
  • Does the article provide original content or information, original reporting, original research, or original analysis?
  • Does the page provide substantial value when compared to other pages in search results?
  • How much quality control is done on content?
  • Does the article describe both sides of a story?
  • Is the site a recognized authority on its topic?
  • Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites, so that individual pages or sites don’t get as much attention or care?
  • Was the article edited well, or does it appear sloppy or hastily produced?
  • For a health related query, would you trust information from this site?
  • Would you recognize this site as an authoritative source when mentioned by name?
  • Does this article provide a complete or comprehensive description of the topic?
  • Does this article contain insightful analysis or interesting information that is beyond obvious?
  • Is this the sort of page you’d want to bookmark, share with a friend, or recommend?
  • Does this article have an excessive amount of ads that distract from or interfere with the main content?
  • Would you expect to see this article in a printed magazine, encyclopedia or book?
  • Are the articles short, unsubstantial, or otherwise lacking in helpful specifics?
  • Are the pages produced with great care and attention to detail vs. less attention to detail?
  • Would users complain when they see pages from this site?

Thank you for your time, and I hope this article was helpful to you.