
Link Localization Testing: Amazon OneLink vs. Genius Link

Product matching has become a necessity for affiliate marketers around the globe who are looking to capture international commissions. Both the Genius Link service and Amazon’s OneLink specialize in matching products across the massive Amazon catalog to close the international commission gap. In response to inquiries about the differences between the two link localization services, the Geniuslink team conducted the testing below to compare the Genius Link service and Amazon’s OneLink.

We initially tested the OneLink service when it was introduced in early July 2017.  You can see the results of that testing in our Comparing Genius Link and Amazon OneLink blog.

In early November of 2017, Amazon expanded OneLink to cover Europe and Japan. At this point, we did a significantly larger test to compare the accuracy of the two services. The following is a detailed explanation of the testing we did.  Please contact us with any questions or if you’d like to replicate our testing.

Set Up

To be as thorough as possible we chose six diverse but commonly used categories, which included Kitchen & Dining, Video Games, eBooks, Electronics, Baby, and Camera & Photo.

We then went to the Best Sellers page and grabbed the links for the top 20 products for each of these categories.  

For each of these links, we then created the Genius Link equivalent of the link and then the OneLink version. Next, we spoofed our IPs (shout out to Wonderproxy!) and tested how each of these links performed for clicks originating from the seven countries (Canada, United Kingdom, Germany, France, Italy, Spain and Japan) that OneLink currently supports.
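As a quick sanity check, the test matrix above (six categories, twenty top sellers each, seven countries) works out like this; the list contents come straight from the article:

```python
# Sketch of the test matrix described above; values are from the article.
categories = ["Kitchen & Dining", "Video Games", "eBooks",
              "Electronics", "Baby", "Camera & Photo"]
countries = ["Canada", "United Kingdom", "Germany", "France",
             "Italy", "Spain", "Japan"]
top_products_per_category = 20  # top 20 Best Sellers in each category

# One test per (category, product, country) combination, per service:
links_per_service = len(categories) * top_products_per_category * len(countries)
print(links_per_service)  # 6 * 20 * 7 = 840
```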

We used a private/incognito browser window for all testing to ensure that any sort of cookie-ing that Amazon did would not skew results.   

Across the six categories and seven countries, our goal was to test 840 links for each service. However, early in our testing we noticed that the first pass of clicks on OneLink links performed poorly but would improve after some time. As a result, we tested all of the OneLink links multiple times, often days apart, to ensure we were getting Amazon’s best result. We ultimately went through 1,280 links per service (that is 2,560 total links!) and our analysis used the latest (most accurate) version of the results.

Due to the massive number of links, and having to test the OneLink links at least twice, our test ran over a two-week span (11/2/2017 through 11/14/2017) and ultimately involved five different people (to reduce any unconscious bias and burnout!).

Each link’s translation was then classified into one of five buckets:

Product Match – The same product was matched internationally! This is the ideal result and is the most seamless experience for the consumer.

Product in Search – The translation resulted in going to a search results page, but the matching product was in the first page of results. While not perfect, this is also a good result of the link translation.

Return to Amazon.com – No translation was made and the international click was redirected back to Amazon.com. While not ideal, this is the same behavior a shopper would experience if the service didn’t exist.

No Product in Search – Similar to the “Product in Search” result, the translation led to a search results page, but the product wasn’t anywhere on the first page of results. This is ultimately a negative user experience.

Wrong Product – When the link translation redirects to the wrong product this is the worst user experience.

From these results, Geniuslink’s translation landed on the correct item in the foreign storefront 50 more times (446 times vs. 396) and went to the wrong product over 10X less often: just five times in 840 tests, whereas OneLink directed shoppers to the wrong product 56 times.

[Results table: counts per bucket for Genius Link vs. Amazon OneLink. Recoverable values: Product Match – 446 vs. 396; Wrong Product – 5 vs. 56.]

Automagic Link Translation Score

While tallying the best- and worst-case scenarios helps show the difference between the two services, we felt this didn’t tell the full story. To further quantify the quality of the localization results, and include the three scenarios between best and worst, a score was applied to every result in the following way:

Product Match: 1 point (Ideal user experience)

Product in Search: ½ point (Good user experience)

Return to Amazon.com: 0 points (Neutral user experience)

No Product in Search: -½ point (Negative user experience)

Wrong Product: -1 point (Bad user experience)
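The five-bucket scoring above can be sketched as a simple lookup table. The bucket keys below are my own shorthand, not names used by either service:

```python
# Point values for each classification bucket, per the scheme above.
SCORES = {
    "product_match": 1.0,      # ideal: exact product in the local store
    "product_in_search": 0.5,  # product on the first page of search results
    "return_home": 0.0,        # no translation; shopper lands on the default store
    "not_in_search": -0.5,     # search page without the product anywhere on page one
    "wrong_product": -1.0,     # redirected to the wrong product entirely
}

def total_score(results):
    """Sum the per-click scores for a list of bucket labels."""
    return sum(SCORES[r] for r in results)

# e.g. three perfect matches and one wrong product:
print(total_score(["product_match"] * 3 + ["wrong_product"]))  # 2.0
```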

Applying these scores to the results above gave the following:

[Score table: points per bucket for Genius Link vs. Amazon OneLink. Recoverable values: Product Match – 446 vs. 396 points; Wrong Product – -5 vs. -56 points.]
Geniuslink ended up with a total score of 409 points while OneLink finished with 326.5 points.

Dividing each total score by the maximum possible score of 840 (a perfect product match every single time) puts Geniuslink at 49% and OneLink at 39%, a relative difference of roughly 25% between the two services.
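The arithmetic behind those percentages, using the totals reported above:

```python
# Totals reported in the article, divided by the maximum possible score.
genius_score, onelink_score = 409.0, 326.5
max_score = 840  # a perfect product match on every single test

genius_pct = genius_score / max_score    # ~0.487, reported as 49%
onelink_pct = onelink_score / max_score  # ~0.389, reported as 39%

# The article's ~25% figure matches the relative gap between the two scores:
relative_gap = (genius_score - onelink_score) / onelink_score

print(round(genius_pct * 100), round(onelink_pct * 100), round(relative_gap * 100))
# -> 49 39 25
```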

The results of this testing show that Genius Link’s localization performed objectively better than Amazon’s OneLink, with more accurate product matching and ultimately a more seamless customer experience. That said, neither service is standing still: we continue to tweak our translation algorithms and expect Amazon to do the same. Further, both offer significant improvements for affiliate publishers with an international audience and outpace all of the other link localization services available.

It’s worth noting that in this comparison we are purely focusing on link translation, but you can read more about what else differentiates the two services.

This testing was conducted by the Geniuslink Team in an attempt to provide an unbiased comparison of the differences between the two services, but we want to encourage everyone to do their own testing to evaluate the services for themselves and validate our results. We look forward to your questions and comments and welcome anyone who wishes to dive deeper into this!