OpenX Overtakes Google at Top of ‘Trust Index’


OpenX has overtaken Google as the most trusted inventory source, according to figures released this week, even as Google aims to underline its leadership in the industry's fight against dishonest players.

The claims were made in the latest update to Pixalate's Global Seller Trust Index (GSTI), which ranks inventory quality and security risk. The company further revealed that 70% of media inventory sources are exposed to malware-driven ad fraud, and that one in 20 internet users is infected by it.

Overall, OpenX ranked as the most highly-rated source of traffic in April, overtaking Google AdExchange, which had led the table the previous month (see chart immediately below).

[Chart: Pixalate Global Seller Trust Index rankings, April]

Meanwhile, OpenX and Rubicon Project lead in five categories each, while Google's AdExchange leads in two (see chart below).

Commenting on the numbers, Pixalate CEO Jalal Nasir said addressing cyber security risks, and the concerns they pose to both enterprises and consumers, is crucial if high-level marketers are to place further trust in programmatic media buying.

The complexity of the programmatic advertising sector means it is particularly vulnerable to malicious players in the ecosystem, as it gives them a means to hide their true identity, according to Nasir.

He added: “This is a complex and growing problem, as many buyers, including reputable brands, purchase seemingly legitimate inventory. Unbeknownst to them, some of the inventory has been compromised and ultimately leads to a negative impact on consumer trust and brand integrity.”

Pixalate's GSTI analyses 100 billion ad impressions across 350 million IP addresses in order to benchmark 400 programmatic media sellers, assessing their vulnerability to darknets owned and operated by malicious organisations. The latest version of the report also breaks sellers down by IAB verticals (see chart below).

[Chart: Pixalate GSTI category leaders]

Google’s efforts to combat fraud

Meanwhile, Google has been at pains recently to demonstrate that it is at the vanguard of the drive to clean up ad tech, following its purchase of Spider.io last year.

The advertising behemoth recently granted industry journal Ad Age access to its 100-strong anti-fraud team in London, offering the industry further insight into its efforts to combat botnets, which are credited with siphoning off $6.3bn from the digital advertising sector each year.

The piece demonstrates the ease with which hackers can hijack machines – through security weaknesses it calls ‘exploits’ – to create botnets that then click on ads en masse, and further profiles Google’s efforts to combat the fraudsters behind them.

More recently, Google used its DoubleClick Advertiser blog to issue a call to action in a post entitled: Stopping Digital Ad Fraud.

Penned by Vegard Johnsen, product manager for ad traffic quality at Google, the piece echoes Google's backing of the IAB's efforts to standardise industry jargon around such practices.

He added: “When fraud is identified it should be shared in a clear structured threat disclosure, mirroring how security researchers release security vulnerabilities. By increasing the amount of data we share in a transparent, helpful way, others in the industry will be able to corroborate any claims being made, remove the threat from their systems, removing it from the ecosystem.”

Johnsen goes on to advocate a system whereby any party that purchases non-blind impressions is passed a chain of unique supplier (and reseller) identifiers – be it an exchange, network, or sell-side platform – plus one for the publisher.

He added: “With this full chain of identifiers for each impression, buyers can establish which supply paths for inventory can be trusted and which cannot.”

Botlab.io

Industry veteran Mikko Kotila, author of last year's WFA report advising brands on transparency in the programmatic buying sector, recently dropped by the ExchangeWire office to discuss his not-for-profit outfit Botlab.io, which aims to help clean up the sector.

In the TraderTalk TV episode below, Kotila discusses his view that independent bodies are better placed to combat fraud than security firms or trade bodies, as the latter are incentivised by profit.

Click below to see Kotila explain some of the different types of fraudulent traffic generators on the market at present:

WHAT IS HOLISTIC AD SERVING?

Certainly one of the biggest opportunities in ad tech today is integrating real-time bidding (RTB) systems with core ad serving platforms such that ad serving decisions are made from a single system. This vision of a fully integrated monetization stack is known as holistic ad serving, and it's going to be big.

Holistic ad serving consolidates what is today a fragmented marketplace, modernizes the publisher ad serving stack, and lays the groundwork for advertisers and publishers to transact guaranteed campaigns over RTB infrastructure. In other words, it provides a way for publishers to transition from a world of manual campaign implementations to accepting and trafficking campaigns programmatically, without having to manage the balance between two systems.

Tactically, holistic ad serving seems like a basic change – instead of filling direct campaigns first and then letting the exchange try to fill whatever is left, the idea is for publishers to call the exchange marketplace and get a bid for every single impression, thereby allowing RTB demand to compete directly with traditionally sold campaigns with guaranteed goals. By at least getting a bid for every impression, the publisher's ad server can understand the benefit or cost of filling an impression with a direct campaign – it has all the information. Holistic ad serving also opens the possibility, on an impression-by-impression basis, for an RTB campaign to trump a direct campaign.
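To make that comparison concrete, here is a minimal, purely illustrative JavaScript sketch of the per-impression decision described above; the function name and the dollar values are hypothetical, not any vendor's API.

// A minimal sketch of the per-impression choice holistic ad serving implies.
// Everything here is hypothetical and for illustration only.
function chooseAd(directCampaignEcpm, exchangeBidCpm) {
  // With a bid returned for every impression, the ad server can compare both sources directly.
  if (exchangeBidCpm !== null && exchangeBidCpm > directCampaignEcpm) {
    return 'exchange'; // let RTB demand trump the direct campaign for this impression
  }
  return 'direct';     // otherwise fill the guaranteed campaign as usual
}

// Example: a $4.00 direct campaign loses this impression to a $6.00 exchange bid.
console.log(chooseAd(4.00, 6.00)); // "exchange"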

Yield Overlap Demands System Integration

Holistic ad serving represents a major shift in the way publishers currently integrate with ad exchange demand, though in a good way. Today publishers use a core ad serving technology like DART for Publishers (DFP) or 24/7's Open Ad Stream to manage their directly sold campaigns, and then redirect to a supply-side platform (SSP) like Rubicon Project or PubMatic to manage the exchange and indirect demand on their inventory, tethering the two systems together with a 3rd party tag. The problem here is that publishers have made a big assumption with this setup: that the yield from an SSP will never exceed the yield from their direct business, and that it always makes more sense economically to serve a directly sold campaign than an indirectly sold campaign.

In prior years, before RTB marketplaces existed, this assumption was pretty sound. Ad networks could rarely compete with the rates the publisher charged when dealing directly with advertisers, as their business model was fundamentally different. Networks sold efficient reach while publishers sold targeted frequency; you'd never expect the former to pay more than the latter. In the world of RTB, however, this assumption makes less sense by the day. As advertisers are able to define their own targeting, programmatically optimize to goals, and trade network markups for a DSP's 'cost plus' pricing, there are more and more cases where an impression is worth more to an exchange buyer than a publisher's directly sold campaign.

To be clear, premium RTB rates are still the exception, but the overlap is growing and it will soon represent a not-insignificant piece of the pie for many publishers. As a publisher today, when you look at a yield histogram from your RTB demand, you see something like the chart below: a large block of demand clearing at very low prices, probably on or near your floor price.

But from there it's a long tail of higher-yielding demand, with a small segment at very premium rates that almost certainly earn more than a publisher's lowest-priced direct deals. And if publishers are paying attention to this kind of reporting, the overlap of bids that exceed directly sold eCPMs is likely growing each quarter. The opportunity, then, is that if a publisher were to make a bid request to the exchange for every impression, they could take more premium bids overall, and they could also use the inventory the exchanges didn't want to fill their directly sold campaigns, reducing the number of low-yielding RTB bids they accept.

Take the hypothetical scenario below – the publisher has directly sold 30% of their inventory and is monetizing the remaining 70% through an SSP. If the publisher has a floor of $1.00, their SSP might only be able to fill half of that inventory, with the rest ending up on AdSense or another performance network of last resort, which likely monetizes at a very low rate. In this scenario, the publisher winds up with an effective yield of about $5.00 on their direct business and about $2.00 on their indirect revenue, at a fill rate of about 50%.

But if the publisher were to implement a holistic ad serving solution, they might find their fill in any given tier from the exchanges is more a function of the inventory they make available, not an absolute impression level, as it is with their direct line of business. In that case, the inventory they were able to monetize at $6.00 on the exchange ends up being about 4% of whatever they make available, which means they fill more impressions at that price level – impressions which would otherwise have been used to support a direct campaign. Now that the ad server knows the yield from every source for every impression, it can serve all the $6.00 indirect demand ahead of the $4.00 and $5.00 direct demand.

Playing that decision out logically across all impressions means the ad server will likely push all the direct demand into the capacity that had been used for the performance networks. Which brings up one of the potential risks of moving toward holistic ad serving: by allowing the exchange to cherry-pick, publishers risk a system that shifts their directly sold campaigns onto lower-performing impressions, likely later in a user's session, and probably in ad slots below the fold. From a monetization perspective though, the impact is pretty positive. Even if the SSP's yield remains flat, the fill rate is likely to increase, resulting in an increase in overall yield – in this example the gain is equal to about 13% in found money. Not much from an absolute point of view in our example, but for any business that throws off a considerable amount of money on its indirect line of business, and many publishers do, a double-digit gain in yield is tremendous. The more the two systems' yield histograms overlap, the more opportunity there is for a publisher.
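To see how the fill-rate math plays out, here is a rough JavaScript sketch of the arithmetic. The tier sizes and eCPMs below are assumptions loosely modelled on the hypothetical above, so the computed lift is illustrative rather than a restatement of the 13% figure.

// All tier sizes and eCPMs below are assumptions for illustration only,
// expressed per 1,000 available impressions.
function revenue(tiers) {
  // eCPM is dollars per 1,000 impressions, so revenue = impressions * eCPM / 1000
  return tiers.reduce(function (sum, t) { return sum + t.impressions * t.ecpm / 1000; }, 0);
}

var waterfall = revenue([
  { impressions: 300, ecpm: 5.00 },  // direct campaigns served first
  { impressions: 350, ecpm: 2.00 },  // SSP fills about half of the remainder above the floor
  { impressions: 350, ecpm: 0.10 }   // performance network of last resort
]);

var holistic = revenue([
  { impressions: 40,  ecpm: 6.00 },  // premium exchange bids cherry-pick ~4% of impressions
  { impressions: 300, ecpm: 5.00 },  // direct campaigns still delivered in full
  { impressions: 310, ecpm: 2.00 },  // SSP fill on what remains
  { impressions: 350, ecpm: 0.10 }   // performance network of last resort
]);

console.log(((holistic / waterfall - 1) * 100).toFixed(1) + '% lift in overall yield');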

Technical Workflow – No Easy Solutions

It isn't all that difficult to see why a publisher might want to move toward holistic ad serving, but the next question is how exactly they go about doing it. The SSPs are doubtless all working to devise a solution, but few seem to be out in market with a product just yet. The two exceptions I'm aware of today are OpenX's integration of their Enterprise Ad Server with their exchange optimization platform and recent acquisition, LiftDNA, and DFP's one-click opt-in to 'dynamic allocation' with the DoubleClick Ad Exchange. You have to be on both OpenX products to take advantage of the seamless integration, but it does purport to be a working holistic ad serving solution. For publishers not on OpenX and hesitant to change, LiftDNA actually has an interesting standalone product, which cleverly works through ad server APIs to traffic and constantly re-prioritize exchange demand as separate placements in the publisher's ad server. The standalone LiftDNA product benefits from being a truly open solution that can work with any indirect source of demand, but it doesn't truly evaluate exchange yield on an impression-by-impression basis the way it can when working through OpenX's ad server. Google's ad server DFP does pass exchange yield impression by impression, but only from its own exchange, not any others. And while I have no reason to think the product doesn't serve publisher interests alone, some might say that because DoubleClick is part of Google, and Google also owns advertiser-facing products like AdWords and InviteMedia, its publisher-facing yield management product inherently has a major conflict of interest.

While it seems like a simple concept – just make a bid request for every impression – integrating ad servers with SSPs is actually a tremendously complex task. Holistic ad serving moves the ad serving decision from one system to many, and the key concern is how to go about that without adding huge amounts of latency to the process. In my mind there are two basic ways to integrate these technologies, and neither is ideal. The first would be to push the exchange intelligence into the ad server by making a bid request to the exchange in a site's header code, and then populating the highest bid value into the ad server using a key-value parameter. The downside to this approach is that it adds a race condition to the page, and makes the page wait to load the publisher's ad tags until the exchange responds. Any solution in this direction needs a reliable way to abandon or time out the request to the exchange after a certain amount of time so the page can continue to load.
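As a sketch only, the header-based approach might look something like the following. It uses the modern fetch and AbortController browser APIs (which post-date this piece), and the bid endpoint, response shape, and the setAdServerKeyValue and loadAdServerTags helpers are hypothetical placeholders rather than any real SSP's integration.

// Hypothetical header-code sketch: request one exchange bid, but give up after a timeout.
function requestExchangeBid(timeoutMs) {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), timeoutMs); // abandon the exchange call
  return fetch('https://exchange.example.com/bid?slot=leaderboard', { signal: controller.signal })
    .then(res => res.json())
    .then(bid => bid.cpm)            // assume a response shaped like { cpm: 3.25 }
    .catch(() => null)               // timed out or failed: proceed without a bid
    .finally(() => clearTimeout(timer));
}

requestExchangeBid(150).then(cpm => {
  if (cpm !== null) {
    setAdServerKeyValue('exchange_bid', cpm.toFixed(2)); // hypothetical helper that sets the key-value
  }
  loadAdServerTags(); // hypothetical: continue loading the publisher's ad tags either way
});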

The second and superior option would be to connect the systems via API and have the ad server cookie-sync with the SSP. That way, the publisher wouldn't have to wait to make the call to the ad server; they could do that straight away and then let the ad server abandon the exchange request from within its own system. Regardless of the setup, the technical challenge in both cases is that holistic ad serving introduces a race condition and now has to wait on a 3rd party system in order to make a decision. Even if the ad server has a way to time out a response from the SSP after a certain amount of time, it's almost impossible to think an SSP-to-ad-server integration doesn't add latency to every impression. This isn't necessarily the SSP's fault either, since it has to wait on DSPs and the exchanges to respond to its requests, but it's a problem nonetheless.

 

Source: adopsinsider.com

WHAT IS A CACHE BUSTER AND HOW DOES IT WORK?

A cache-buster is a unique piece of code that prevents a browser from reusing an ad it has already seen and cached, or saved, to a temporary memory file.

What Does a Cache-Buster Do?

The cache-buster doesn't stop a browser from caching the file; it just prevents the browser from reusing it. In most cases, this is accomplished with nothing more than a random number inserted into the ad tag on each page load. The random number makes every ad call look unique to the browser, preventing it from associating the tag with a cached file and forcing a new call to the ad server.

Cache-busting maximizes publisher inventory, keeps the value and meaning of an impression constant, and helps minimize discrepancies between Publisher and Marketer delivery reports.

What Does a Cache-Buster Code Look Like?

Typically, a JavaScript function like the one below powers a cache-buster:

<script type="text/javascript" language="JavaScript">
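// generate a large random number once per page load; the ad tags below append it as the "ord" value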
ord=Math.random()*10000000000000000;
</script>

This code is put toward the top of the page within the site’s <body> tag and creates a random number for the “ord” value in the ad tag. So, when a browser hits a tag, it builds the ad tag like this –

http://ad.doubleclick.net/ABC/publisher/zone;topic=abc;sbtpc=def;cat=ghi;kw=xyz;tile=1;slot=728x90.1;sz=728x90;ord=7268140825331981?

If the browser then returns to the same page later on, the same tag might look like this, where everything remains the same except for the random number.

http://ad.doubleclick.net/ABC/publisher/zone;topic=abc;sbtpc=def;cat=ghi;kw=xyz;tile=1;slot=728x90.1;sz=728x90;ord=6051834582234?
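For illustration, a simplified, hypothetical version of how a tag might append that random number on the page could look like this (a legacy document.write-style call, reusing the placeholder path and key-values from the example URLs above):

<script type="text/javascript" language="JavaScript">
// Simplified illustration only: append the random "ord" value generated above to the ad call.
document.write('<scr' + 'ipt src="http://ad.doubleclick.net/ABC/publisher/zone;'
  + 'topic=abc;sbtpc=def;cat=ghi;kw=xyz;tile=1;slot=728x90.1;sz=728x90;'
  + 'ord=' + ord + '?"></scr' + 'ipt>');
</script>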

Why Does a Browser Cache in the First Place?

When a browser navigates to a web page today, the Publisher's Content Server sends it an HTML file with instructions on how to format the page and where to retrieve all the images, text, and other pieces of the page.  Downloading all this information takes the browser time and memory, so it tries to save as much of the information as possible for future use in temporary folders (the cache) on the user's hard drive.

This technique lets a browser surf through a website much faster.  It’s less important in an age of high speed fiber optic connections, but made a huge difference in the days of 56K modems, when each page took seconds if not minutes to load.  And, since most web pages are built on templates, many elements of a site are used on every page, for example, the site’s logo.  Why fetch the same image again and again when the browser can save it once, and simply reference the same file on every page? The browser is smart enough to read the HTML code for each page and recognize what content it already has and just skip to the next line of code to look for the unique and previously unseen data.

This would certainly work for the ads on the page, too.  If a user loaded a publisher's homepage, for example, then went to an article page, then back to the homepage, the ad tag would be exactly the same, and without a cache-buster the browser would simply re-use the ad it called the first time. Since Publishers get paid for every impression, though, they don't want this to happen; they want the browser to call, or consume, another impression so they can charge for it. Advertisers might like the idea of free impressions, but when pressed, most would tell you cache-busting is a good thing for them, too. Recycled ads screw up reports, mess with ROI calculations, and add an uncertainty factor to campaign data, not to mention create tension between Publishers and Advertisers via discrepancies.

In fact, if you are having an issue with 3rd party discrepancies where the publisher numbers are much higher than the advertiser numbers, the first thing you should check is that a cache-buster is in place and working.

Some Interesting Facts About Online Advertising

Here are some really interesting facts about online advertising. Enjoy!

  1. Video ads account for 3% of time spent viewing video online
  2. Video ads accounted for 31% of all videos viewed online
  3. The average US online video ad is 24 seconds long
  4. Display ads account for 0.9% of upstream traffic to department store sites
  5. Brand marketers will account for 27% of online display ad spending by 2018, down from 31% in 2011
  6. Brand marketers account for 33% of all online display ad spending, down from 48% in 2006
  7. 8% of US moviegoers watched film previews online via game consoles in 2012, up from 4% in 2010
  8. 15% of US moviegoers watched film previews online via smartphone in 2012, up from 6% in 2010
  9. 8% of US moviegoers watched film previews online via tablets in 2012, up from 5% in 2011
  10. 76% of consumers in the US and UK say they receive more marketing messages containing customized offers or invitations than they did 5 years ago
  11. 28% of consumers in the US and UK want to receive personalized marketing messages which include recommendations for specific products
  12. The internet accounts for 26% of US consumer interaction with media, and 22% of advertising spending
  13. Mobile devices account for 12% of US leisure time, and 3% of advertising spending

Source: http://www.factbrowser.com/tags/online_advertising/

How Ad Servers Target by Geolocation

In today's digital ad market, geotargeting depends on mapping a user's IP address to a physical location – a task that, to my knowledge, every ad server outsources.  This is because the process of assigning a geographic location to an IP is messy and complex, to say the least.  Just because the ad server outsources the functionality, however, doesn't give Ops an excuse to ignore this important and highly utilized feature.

How is an IP Address Associated with a Geographic Location?

By and large, IP addresses are arbitrary – meaning they could be anywhere, and there isn't much rhyme or reason to their values from a geographic perspective.  It isn't as though an IP address starting with a 1 is always located in the United States, for example.  Instead, companies like Digital Envoy use a multi-layered approach to assign geographic qualities to a user – some techniques highly technical, some just common sense, and some a combination of the two.

On the common sense side, a fair number of geolocation companies leverage the Regional Internet Registries, or RIRs, to assign high-level qualities like country or continent.  The RIRs each own dedicated ranges of IP values, exist to allocate IP addresses within their regions, and cooperate with each other to ensure the same IP isn't being used in more than one place. So placing an IP address within a specific RIR's range allows the service to identify location at a very high level.  Some geolocation services are also rumored to work with large registration-based sites, and have zip code information that a user might manually enter during a sign-up process.

Pings, Traceroutes, Reverse DNS, and Other Technical Methods of Geolocation

From there though, the heavy lifting is usually done through a combination of three technical processes known as pings, traceroutes, and reverse DNS lookups.  Let’s run through a high level explanation of all three processes, and then explain how they work in concert to geographically locate a single IP address.

A ping is just a small piece of information sent from one computer to another, with a request to call the originating computer back.  Pings can also record the round trip time of the journey, and are used for a variety of administrative network processes.  Think of it like a submarine’s sonar technology, applied to the internet.

Tracerouting is basically a way to record the network routing process of the ping service, or the detail behind how the ping got from one machine to its destination.  Tracerouting records how a ping is routed, who it is routed through, and the time it takes at each step.  When information travels across the internet, be it a ping or just regular surfing, it moves through a series of very high speed fiber optic networks owned by various public and private entities.  When the information gets physically close to a user, it passes down to an Internet Service Provider (ISP), which sells internet access to consumers.  The ISP eventually moves the packet of information to a network router near the user, which connects directly to the user.  By using the traceroute utility, the geolocation service can know every system the information passed through in order to get to its final destination.  The important piece of information the service gets from a traceroute is the IP address of that final network router, the one geographically nearest to the user.  You can see the ping or traceroute commands in action on your own machine at Network Tools.

With the network router’s IP address in hand, the geolocation service can finally use a technique known as a reverse DNS lookup to identify who owns that network router, which it can use to lock in on the physical location of the user.  Reverse DNS is simply a service to identify the hostname of an IP address, that is, who owns an IP address.  For many home computers, the host ends up being the ISP.  For businesses, the host ends up being the company’s domain. DNSStuff provides a reverse DNS lookup service – just enter an IP address into their ‘IP Information’ tool to try it out.
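If you'd rather try a reverse DNS lookup from your own machine, a minimal Node.js sketch looks like this. The IP is just an example, and the hostname returned depends on the PTR records its owner publishes.

// Minimal Node.js sketch: reverse DNS lookup for an IP address.
const dns = require('dns').promises;

dns.reverse('8.8.8.8')
  .then(hostnames => console.log(hostnames))                      // e.g. [ 'dns.google' ]
  .catch(err => console.error('No PTR record found:', err.code));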

Geolocation in Action

Now that you understand the basic approach, here’s how it all works together at a high level –

When a geolocation service wants to triangulate an IP, it starts by pinging that IP address from a central server it owns, and then looking at the traceroute.  From the traceroute, the service can identify the nearest network router to the user by IP, labeled point A on the diagram below.  Then, using a reverse DNS lookup, the service can find out which ISP owns that router, and then query the location from public data, the ISP itself if the service has a business relationship in place, or failing that, triangulate the location with the process below.

In all likelihood, the geolocation service already knows the location of this network router, either by working with an ISP directly, or through previous triangulation efforts.  With that location in hand, the geolocation service hands off the triangulation process to servers closest to that network router, of which it also knows the exact geographic location.  Now, the service sends a ping from at least three of its own separate servers (1, 2, 3), and records the time it takes to reach the user.  Only time can be recorded from a ping, not distance, but using time as a radius, the geolocation service can draw a circle around each server, and know that the target location must exist at some point on the arc.

Geolocation by Ping Triangulation Explained

With three separate locations, the target location should exist at the one point where all the arcs meet, which also gives the service the exact vector to the target from each server.  And, since information runs through fiber optic cable at a known, constant speed (about 2/3 the speed of light), the service can now translate that time into a distance, and with the vector and a known server location, calculate the exact location of the target, within a certain margin of error, depending on the exact method used, and how many points of triangulation are employed. Currently, the most advanced geolocation triangulation methods employ as many as 36 points to eliminate problem data and increase accuracy, and can accurately map an IP address within 700m – but we’ll talk more about that in the final piece in this series.
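As a back-of-the-envelope illustration of that time-to-distance conversion (the numbers below are assumptions for the example, not measured data):

// Convert a round-trip ping time into an approximate maximum distance,
// assuming signals travel through fibre at roughly 2/3 the speed of light.
const SPEED_OF_LIGHT_KM_PER_MS = 300;   // ~300,000 km per second
const FIBRE_FACTOR = 2 / 3;             // effective propagation speed in fibre

function maxDistanceKm(roundTripMs) {
  const oneWayMs = roundTripMs / 2;     // a ping measures the round trip
  return oneWayMs * SPEED_OF_LIGHT_KM_PER_MS * FIBRE_FACTOR;
}

// A 10 ms round trip puts the target within roughly a 1,000 km radius of the server;
// intersecting such circles from three or more servers narrows down the location.
console.log(maxDistanceKm(10).toFixed(0) + ' km'); // "1000 km"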

Network Maps & WHOIS Lookups

Using either piece of information – the ISP or the business domain – the geolocation service can further refine the geographic values of a given IP.  Geolocation services may also work directly with ISPs to get the general physical location, when available, of a given IP, since the ISP will know the exact address of the customer using that connection at any given time.  It's important to note that no PII is exchanged in that process; a zip code is just mapped to the IP address. Not all ISPs participate, and some may simply provide the location of the final network router instead of the end-user's zip.

Some of the more sophisticated geolocation services may be able to deduce the physical location of an ISP's network routers, also known as the ISP's network map, by pinging those routers from various servers with known geographic locations, measuring the time it takes to get a response, and using that information to triangulate each router.

Businesses may also have a specific address, available through a WHOIS lookup, which allows country, state, city, and zip to be assigned.  The WHOIS directory is a public registry of who owns what domain, along with their name, and importantly, address.  Through this information, geolocation services can get a better idea of the physical location of each machine.

Where Does Geolocation Data Come From?

In most cases, it comes from a 3rd party table supplied by a company that specializes in geolocation data.  Practically speaking, most of the advertising industry relies on a small company called Digital Envoy, which was founded in 1999 by a few smart entrepreneurs and acquired by a larger media company, Dominion Enterprises, in 2007.  Digital Envoy pioneered the process of linking an IP address to a geographic location, and specializes in keeping the information current and accurate.

Effectively, Digital Envoy maintains a massive table of literally billions of IP addresses and their inferred geographic qualities, and then sells access to that table, at various levels of granularity, to ad servers and lots of other companies with an interest in identifying the location of a user. Those customers then cache the information in their local databases and run queries against it.
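In practice the licensed table is far larger and indexed by numeric IP ranges, but a toy JavaScript sketch of the cache-and-query idea might look like this (the addresses below are reserved documentation ranges and the entries are made up):

// Illustrative only: how a customer might cache and query a licensed geo table.
const geoTable = new Map([
  ['203.0.113.0/24',  { country: 'US', region: 'GA', city: 'Atlanta', zip: '30301' }],
  ['198.51.100.0/24', { country: 'GB', region: 'ENG', city: 'London', zip: null }]
]);

function lookup(ip) {
  // Real tables index numeric IP ranges; matching on a /24 prefix keeps the sketch simple.
  const prefix = ip.split('.').slice(0, 3).join('.') + '.0/24';
  return geoTable.get(prefix) || null;
}

console.log(lookup('203.0.113.42')); // { country: 'US', region: 'GA', city: 'Atlanta', zip: '30301' }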

Other companies that perform this service include Quova, MaxMind, GeoBytes, Cyscape, IP2Location, and Akamai's EdgeScape product, though there are also free services out there such as HostIP, IPInfoDB, and Software 77.

[This article was originally published on Run of Network in Dec of 2011]

How To Optimize Your Website’s Performance

By Mike Quinn, president, Yellow Bridge Interactive (YBI)

Web pages and websites are getting bigger and becoming more complex every day. But when a website does not load quickly, it affects visitors’ behavior, which leads to decreases in sales conversions and revenue.

A website can slow down for a number of reasons, including low server memory, competing resources or data influx. If a web server is slow, it will hinder the website’s performance. Likewise, a site receiving a great deal of traffic can also slow down load times or disrupt a visitor’s experience entirely. Navigation, site design, images and apps can also affect how quickly and effectively a website is displayed.

Bottom line: Your website’s speed can be the difference between generating revenue and not generating revenue.

Don't stop monitoring your website's performance: it should be part of your daily web design workflow. Check home page load time, checkout process load time, and conversion rates at regular intervals.
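One lightweight way to keep an eye on load time is the browser's Navigation Timing API. The sketch below is illustrative only, and the two-second threshold simply echoes the shopper expectation cited in the next section.

// Log how long the page took to load, using the Navigation Timing API.
window.addEventListener('load', function () {
  // defer one tick so loadEventEnd has been recorded
  setTimeout(function () {
    var nav = performance.getEntriesByType('navigation')[0];
    if (!nav) return;                                  // older browsers may lack this API
    var loadTimeMs = nav.loadEventEnd - nav.startTime;
    console.log('Page loaded in ' + Math.round(loadTimeMs) + ' ms');
    if (loadTimeMs > 2000) {
      console.warn('Slower than the roughly two-second expectation shoppers now have');
    }
  }, 0);
});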

Web Performance Makes a Difference in Sales

Don't think a couple of seconds can make a difference? Think again. According to Jupiter Research (which has since been acquired by Forrester), the average online shopper in 2006 expected a web page to load in four seconds. Today, those same shoppers expect web pages to load in two. Poor web performance is one of the biggest reasons people are dissatisfied when shopping online. People who experience performance issues will abandon a site or switch to a competitor. Because page load time matters so much to users, even Google has begun factoring site speed into its algorithm when ranking websites.

Responsive Web Design Also Affects Web Performance 

Website visitors expect the same type of experience on their mobile device as they do on their computers.

So now, not only do you have to think about how a website performs in various computer web browsers, you also have to think about optimizing it for the many types of mobile devices. This is where responsive web design comes into play. Responsive web design (RWD) involves creating a site that adjusts depending on what type of device is viewing it, scaling content down so that only the main text and images are shown.

It’s important to note, though, that just because a site has responsive web design and looks good on a certain device, it does not necessarily mean that it will load faster. And just because it loads faster on a mobile device, does not mean a visitor or customer will stay on it longer.

Three Ways to Optimize Your Website

There are a few things you can do to make sure your site is performing at optimum speed. First, run a web page analyzer to see what is actually being loaded and what is taking the most time — and clean up any problematic HTML, CSS and JavaScript code. Next, here are three best practices to consider:

  1. Get a dedicated server. One way to improve performance is to move to a faster server or get a dedicated server. Although it may cost more, being on a slow server can cost you even more in sales long term.
  2. Use a CDN. If your site has large amounts of content to display, consider using a content delivery network (CDN) — a company that employs a large system of servers placed in various locations to deliver web pages to visitors. Most CDNs are used to host static resources such as images, videos, audio clips, CSS files and JavaScript. The closer a CDN server is to a site visitor, the faster the content will be delivered to the visitor’s computer or mobile device. CDNs help improve global availability and reduce bandwidth. However, the main issue a CDN addresses is latency, or the amount of time it takes to deliver website pages to the visitor.
  3. Compress images and text. Another way to improve website performance and speed up page load times is to compress images and text; the server then does not have to send out as much data. Some hosting providers automatically compress websites, and there are a number of tools you can use to test whether or not yours is compressed (see the sketch after this list for one server-side approach). Most sites are image heavy, so if you want to optimize an image without losing visual quality, you can use a tool like Yahoo's Smush.it. For web graphics, use GIFs or PNGs rather than JPGs.
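As one possible server-side example of the text-compression point in item 3 above, here is a minimal Node.js sketch using the Express framework with the compression middleware; your own stack or host may handle this differently.

// Minimal sketch: gzip-compress text responses with Node, Express and the
// compression middleware (npm install express compression).
const express = require('express');
const compression = require('compression');

const app = express();
app.use(compression());             // compresses HTML, CSS and JavaScript responses
app.use(express.static('public'));  // serve the site's static files from ./public

app.listen(3000, () => console.log('Listening on http://localhost:3000'));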

Just like a physical store needs organizing, websites need cleaning. When it comes to page-load optimization, every kilobyte counts. Web performance is a critical part of a customer’s experience. Don’t put optimizing a website’s loading time on the back burner, as it can be detrimental to loyal readers or repeat business.

For over a decade, Mike Quinn has been active in website design and development. After completing formal training in multimedia technologies in 2002, he became a founding member of a Pittsburgh based design company, Yellow Bridge Interactive (YBI). YBI’s focus is creating SEO-friendly websites that utilize the latest design and programming techniques.

The Young Entrepreneur Council (YEC) is an invite-only organization comprised of the world’s most promising young entrepreneurs. In partnership with Citi, YEC recently launched StartupCollective, a free virtual mentorship program that helps millions of entrepreneurs start and grow businesses.

Ways to Increase Your Site’s Traffic

If you have a product you’re really proud of, it should speak for itself. But when it comes down to it, you still need to get customers on your website in the first place — especially if you’re running an e-commerce operation.

We asked a panel of 13 successful entrepreneurs to share their best advice for generating high-quality, organic search traffic to their business websites. Here’s what they had to say:

1. Focus on the Long Tail

If you are a new site, it can be difficult to go after popular keywords right away. I find it better to write many quality articles on very specific keywords than to go after the ones with more search traffic. A great benefit of staying focused on long-tail key terms is that they usually convert better, as well. To help with this, I recommend a tool called HitTail, which drives targeted search visitors to your website by focusing on the most promising organic keywords in your existing traffic.

Lawrence Watkins, Great Black Speakers

 

2. Stick Around

The longer you are in business and producing quality online content, the more likely you are to pop up in search results for all related keywords. Starting a blog or churning out a bunch of articles is all fine and good, but keeping those activities going for years as opposed to months (or weeks) makes a huge difference.

Alexandra Levit, Inspiration at Work

 

3. Optimize Your Articles

There are three main ingredients to a successfully optimized web page or article: your meta title, description and keywords. This is such a simple thing to fill out when you're publishing a piece of content on your site, so take the time to do it each time, and you'll start to rank for your keywords much faster.

Nathalie Lussier, The Website Checkup Tool

 

4. Don’t Forget About (Ethical) Link Building

Keyword-embedded links are the foundation of off-page search engine optimization. The best part is that links can be free. Just ask vendors, partners, press, clients, your alma mater and any other credible source you interact with to embed hyperlinked keywords back to your site for the terms you are targeting. If the referring source has a high page rank, you should see a pop in your rankings within two months of the links being published.

Christopher Kelly, Convene

 

5. Use Google’s Keyword Tool

Use Google's Keyword Tool to find long-tail keywords that are not as competitive, then structure some content around those. If you are in a competitive niche, this is a way you can start building up some small recurring traffic and engage your users.

Patrick Curtis, WallStreetOasis.com

 

 

6. Provide Amazing Value to Your Readers

When it comes to increasing organic search, content marketing through blogging or guest posts is the fastest way to build great traffic. However, content marketing is a quality game and not a quantity game. If you have horrible content, people won't bother reading it or sharing it, which is basically the entire point of building a company blog. Therefore, when I write content, I constantly ask myself if I would take ten minutes out of my day to read it and if I'd share it with others. If you wouldn't do either of those things, then you really need to look at your content strategy again.

Liam Martin, Staff.com

 

7. Don’t Try to Outsmart Google

Gaming Google's system might work temporarily, but it is not a good strategy for the long haul. To increase organic search traffic, produce top-notch content that's relevant to what your users might be searching for. Check the Google Keyword Tool to make sure you're using the terminology the general public uses when searching.

Sarah Schupp, UniversityParent

 

8. Think of SEO as an Opportunity to Create Value

SEO isn't a game. At least it's not a game that you'll win in the long run if you think of it as a game. Create content that readers find valuable and Google will deem search-worthy. Visitors are more likely to share content that they enjoyed reading and will stay on your site longer, while bloggers and the media might use your site as a reference, which means more organic links.

Danny Wong, Blank Label

 

9. Decrease Bounce Rate

If there is one thing search engines hate, it is a high bounce rate. Check your keywords for this, and optimize those pages to reduce your bounce rate. Search engines will love you for it.

Adam Lieb, Duxter

 

 

10. Produce Quality Content

Search engines reward people and companies that consistently produce high-quality content. Things such as author rank are going to have a big effect on organic search results. Put a plan in place to not only create content to publish online, but also to maximize the value of that content so it is properly distributed across social channels and has a chance to go viral.

John Hall, Influence & Co.

 

11. Create a Company Blog to Increase SEO Traffic

SEO is king in organic search traffic. The more popular search terms within your niche that you include on your website, the more searches will organically lead to your site. But including too much text on the main pages of your site can do more harm than good, making it difficult for consumers to find the information they want. Instead of overwriting the copy on your homepage, about page and product pages, start a separate blog for additional SEO work. Use the blog to write about your niche, whether it's construction, beauty or entertainment. Do keyword research to find out which phrases are trending in your industry and include them in your blog posts. As long as your blog has a highly visible link back to the main page of the website, the blog will increase your visibility.

Jay Wu, A Forever Recovery

12. Leverage Industry Experts

Everyone likes opining as an expert. You'll be surprised how easily you can convince industry leaders to contribute guest posts to your own blog. They will likely have their own readerships, and those people will become familiar with your brand. The experts are also likely to produce great written content that will be of great interest to your existing users.

Chuck Cohn, Varsity Tutors

 

13. Create a Community

Increasing Google traffic is all about answering questions your community finds important. You need to become the authority in your niche. Have your community ask you questions, and you'll be well on your way to providing high-quality, valuable and useful content. That's what Google cares about. When you provide answers to your community's questions, Google will rank your site well for many keyword terms you wouldn't have been able to think of on your own. You create loyalty in your community and rank well in Google at the same time. That's a win-win.

Mitch Gordon, Go Overseas


 

Source: http://mashable.com