- Product focused – know what they want and they want it fast;
- Browsers – have time to browse around and need to see what’s new;
- Researchers – need an easy way to look up and compare product information;
- Bargain hunters – look for a great sale; and
- One-time shoppers – may also be one of the other four, and hate having to create and register an account with you.
Come 21 April, Google will roll out its new “Mobile Friendly” algorithm update, which will give preference in search results to websites that are mobile friendly.
For your website, this simply means it will be left behind in mobile search results unless Google’s bots deem it mobile friendly.
How do you know if your site is ready for mobilegeddon? Fortunately Google, being Google, foresaw the outcry from website owners that would follow if it brought its usual algorithm guessing game to such an important update, so it has rolled out more than enough tools to help you prepare for the big day.
Without further delay, here are the tools and information you’ll need to self-diagnose your site in preparation for mobilegeddon:
- Mobile-Friendly Test – simply enter your website URL and hit Analyse, and you’ll know within seconds if your site is up to speed. Hopefully you’ll get a result like so:
- Google Webmaster Tools Mobile Usability Report – another tool that helps webmasters identify elements of a website that do not meet Google’s mobile-friendly standards, since it may be that some, not all, of your pages have problems. Errors here should be addressed if you want to keep up in mobile search results. Here’s an example result for good measure:
- Mobile Friendly Guidelines – in case you find yourself on the undesirable side of this update after using the tools above, fret not: here’s everything you need to get back into the good books of Google’s mobile search results.
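As a quick first-pass sanity check before running the tools above, a short script can flag pages that are missing the responsive viewport meta tag, one of the most common reasons pages fail the Mobile-Friendly Test. This is only a rough sketch; the real test weighs many more signals (tap target size, font size, content width):

```python
import re

def has_viewport_meta(html: str) -> bool:
    """Rough check for a responsive viewport meta tag.

    Catching a missing tag is a first-pass check only; Google's
    Mobile-Friendly Test evaluates many additional signals.
    """
    pattern = re.compile(r'<meta[^>]+name=["\']viewport["\']', re.IGNORECASE)
    return bool(pattern.search(html))

page = '<head><meta name="viewport" content="width=device-width, initial-scale=1"></head>'
print(has_viewport_meta(page))             # True
print(has_viewport_meta('<head></head>'))  # False
```

A passing check here is no guarantee of mobile friendliness, but a failing one almost always means a failed Mobile-Friendly Test.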
Remember, this is not just about penalties but also about rewards. A mobile-friendly website will be rewarded as much as a non-mobile site is penalised.
And as always, if you need help in keeping up with all these changes, don’t hesitate to get in touch with us any time.
- A subject that’s a real-world entity
- A description of some characteristic of said entity
- An object that shows the value of said characteristic
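Triples like this map naturally onto structured data markup such as schema.org’s JSON-LD format. A minimal sketch, with the product name and values invented purely for illustration:

```python
import json

# Subject:   the real-world entity (here, a product)
# Predicate: a characteristic of that entity (e.g. its weight)
# Object:    the value of that characteristic
triple_markup = {
    "@context": "https://schema.org",
    "@type": "Product",      # the subject's entity type
    "name": "Acme Anvil",    # hypothetical product name
    "weight": "5 kg",        # predicate -> object value
}

print(json.dumps(triple_markup, indent=2))
```

Each key/value pair in the dictionary expresses one predicate–object pair about the same subject.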
In an effort to promote a more secure web and better reflect relevant search results, Google recently announced that it would now take into consideration website encryption, also known as HTTPS, when ranking sites on their search engine results pages (SERPs).
It’s a move that should wake up web developers who have procrastinated over their implementation of security measures, or site owners who may have wondered if their sites were “important enough” to need encryption.
According to Google, HTTPS will initially be a minor search ranking signal, affecting less than 1 per cent of all queries around the world. That means it won’t have as immediate an impact as other ranking factors, such as the quality of content on a web page, as Google wants to give webmasters enough time to switch over to HTTPS.
Still, that doesn’t mean you can drag your feet with your site’s security, as encryption is bound to have a major effect on search ranking, what with Google being a staunch advocate of website security. It’s best to start as early as possible, and with the potential bonus of higher search rankings, there’s no better time than now.
To facilitate the switch to a more secure web, the company is looking to publish a series of guidelines around HTTPS, helping website developers better understand what needs to be done to properly encrypt their websites, as well as how to avoid common mistakes. Google adds that these tips will include best practices on everything from the type of certificate needed, to the proper use of relative URLs for resources on the same secure domain, to allowing site indexing, and more.
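One of the migration tasks mentioned above, switching resource references on your own domain to relative URLs, can be sketched as a simple rewrite pass. This is an assumption-laden sketch (regexes are a blunt tool for HTML, and a real migration would use a proper parser), with `example.com` standing in for your own domain:

```python
import re

def relativise(html: str, domain: str) -> str:
    # Replace absolute links to our own domain with root-relative paths,
    # so every resource loads over whichever scheme served the page.
    pattern = rf"https?://{re.escape(domain)}(/[^\"' >]*)"
    return re.sub(pattern, r"\1", html)

print(relativise('<img src="http://example.com/logo.png">', "example.com"))
# <img src="/logo.png">
```

Links to other domains are deliberately left untouched; those need to be checked for HTTPS availability case by case.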
Furthermore, Google recommends that website developers test their HTTPS-encrypted websites with the Qualys Lab tool, while further questions on encryption and its relation to search ranking can be sent to Google’s Webmaster Help Forums, where the company actively interacts with a larger community of site owners and developers.
Like most Google announcements involving its search ranking algorithms, this one has drawn plenty of feedback from website owners and developers, as well as those in the SEO industry. Google’s blog post on the subject has more than 1,500 comments as of this writing. Reactions are mostly in favour of the change, with many seemingly expecting that such a development was going to happen sooner or later. One commenter opined, “So, it’s not often that you’ll get SEO tips directly from Google — but here’s one that I’m proud to be associated with: HTTPS is now being used as a ranking signal.”
Google’s announcement is consistent with its efforts to better secure its own traffic, which included encrypting traffic between its servers. Gmail now uses an encrypted HTTPS connection by default, preventing mail from being snooped when moving between users and Google’s servers.
In a time when paranoia over government cyber spying is at a frenzy, tech companies are scrambling to beef up their own security measures. In November last year, Yahoo! also announced plans to encrypt its own data centre traffic.
At Enform, we’ve long seen encryption and HTTPS as fundamental measures for improving a site’s security, no matter how small it may be. This time around, Google’s efforts only provide another incentive for webmasters to make the switch.
E-commerce solutions like PARts B2 provide detailed product descriptions and details from the supplier using a database.
The basic tenet of e-commerce: Help the customer find your product and get what they want. If a potential customer can’t find your product, you obviously won’t get a sale.
However, connecting with relevant product pages is just the initial phase of the purchase process. And while it’s true that many sites have made improvements to their navigation and information architectures, many product pages on e-commerce sites are still in need of crucial improvements.
Enform’s clients already know that product pages should do more than just have a product image, a generic description, and an option to add to the cart. Instead, the page should sell the product, convincing users that the product on the page is exactly what they’re looking for.
Yet as simple as that sounds, many pages fail to do this.
Product pages are especially important since they fill the gap left by the traditional shopping experience, where users are normally able to touch the product, examine its packaging, and test or fit it before the purchase. Online, users can only go by what they see on the product page.
Multiple e-commerce studies by web usability experts the Nielsen Norman Group (NN/g) show that as much as 20% of all observed task failures, or times when users abandoned or failed to complete a purchase, were caused by poorly written or incomplete product information.
NN/g recommends the following tips for website product pages.
- Pages Should Answer Customers’ Questions
NN/g’s research specifically indicates that many users simply couldn’t find enough information to make an informed purchase decision. Now, there’s no way to guarantee that your product pages will answer all questions by potential customers, but that doesn’t mean you should settle for the bare minimum either.
The J. Peterman Company is known for using lengthy, verbose stories as product descriptions, in its print catalogues as well as online. It also follows its more eloquent prose with standard facts about the item for sale, such as “pointed collar,” “shell buttons at center front,” “1-inch grosgrain ribbon (antique white) at neckline and left front placket,” and “adjustable cuffs.”
Besides the most obvious features of the product, shoppers also want to know the smaller details of products they’re eyeing, and that can be anything from accents on clothes, furniture dimensions, product care information, toy sizes and storage recommendations for edibles, to whether or not a hotel has a heated outdoor pool open all year.
Where many sites get it wrong is in their focus on basic information, or sometimes even the wrong information.
- Go Straight to the Point
Just because we told you not to settle for basic information doesn’t mean you should write long-winded descriptions of your products. There’s a difference between a complete product description and a wordy one. Users want information that describes the product, not incessant pleas to buy. One or two calls to action will suffice; don’t go overboard with the marketing messages.
Forever21’s brief description covered key details about the product, its construction, and how a customer could wear the item. This was followed by a bulleted list of product details, including fabric, measurements, and care instructions, which is quite a good example of going straight to the point.
Users often skim through text when browsing and reading online, and are more likely to read at the beginning of the text than the end. Given the importance of the first few lines of your product description, don’t waste it on text that doesn’t help the user.
Another great way of conveying the specifics of a product is to use product photos. NN/g found that large, detailed images are a tremendous help to users wanting to know more about a product. Unfortunately, many sites settle for small images that fail to show sufficient product detail.
- Make Comparisons Easy
Several online shoppers view the ability to compare multiple products as a crucial factor in shaping their purchase decisions. It’s imperative that you offer a facility to help users decide which of several products is best for them in a smooth and easy manner.
Pottery Barn listed information about dressers in a consistent and descriptive way. Two bedside-tables descriptions began with brief overviews, and then bulleted lists that provided comparable details about the products, listed in the same order for each. Each listed dimensions, followed by materials, features, finish information, and hardware details.
It also helps if you can reduce the need for comparisons by simplifying your product line, if your catalogue allows for it. For sites that can’t, such as e-commerce sites carrying multiple vendors, some help from tools is needed.
Many e-commerce sites already have tools that enable shoppers to compare products side by side. Some of these are effective, others not so much. According to NN/g, the key here is to offer comparable information in an easy to compare manner between similar products. It also pays to be consistent in the volume of information featured for every product; customers don’t like seeing plenty of information on one product, and hardly any on another.
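That consistency requirement can even be checked mechanically before products go live in a comparison tool. A minimal sketch, with the dresser data invented for illustration:

```python
def comparable(products: list[dict]) -> bool:
    """True if every product in a comparison set exposes the same set of
    attributes, so no product shows plenty of detail while another shows
    hardly any."""
    if not products:
        return True
    expected = set(products[0])
    return all(set(p) == expected for p in products)

dressers = [
    {"dimensions": "90 x 45 x 75 cm", "material": "oak", "finish": "matte"},
    {"dimensions": "80 x 40 x 70 cm", "material": "pine", "finish": "gloss"},
]
print(comparable(dressers))  # True
```

A check like this only verifies that the same fields exist; keeping the fields in the same display order, as Pottery Barn does, is a presentation decision on top of it.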
Overall, remember that many customers are actually looking for a reason or confirmation to buy your product or transact; try not to disappoint.
Making Effective Use of Analytics to Enhance the User Experience
For a long time, the primary use of analytics systems was to help marketers develop marketing strategies, providing important statistics on everything from page views, clickthrough rates, impressions, and more. However, we now see this source of quantitative data helping user experience (UX) professionals improve the usability design of websites.
The value of analytics is sometimes lost on us; it has a tendency to become a black hole of data which, although useful, provides no real value to the uninformed webmaster. At best, this useful data is left untouched and subscribing to the analytics service becomes a waste of money. At worst, analytics systems lead to costly expenditure in areas less productive than others.
Analytics systems can become a distraction, turning from something that helps you in your work to something you need to spend time understanding and getting to work. For UX professionals, it’s important to take one step back and think of how analytics data can help in enhancing current usability techniques.
Web usability experts the Nielsen Norman Group surveyed several UX teams to find out how they used their analytics data, coming up with some interesting findings.
1. Issue Indication
NN/g found that a number of UX teams collaborated with optimisation experts while designing a site and launching new features, creating what they referred to as a measurement plan. UX teams receive regular reports to keep track of the site’s ability to meet usability goals, turning to the analytics system to diagnose issues.
A measurement plan normally consists of:
- Goals, or macro conversions, which refer to large-scale actions that users perform on the site for it to convert to success. Think purchase completions or lead submissions.
- Desirable actions, or micro conversions, which as the name suggests, pertain to smaller actions that often lead to a larger goal. This can be anything from visiting a page, to clicking a link, to keying in user data on a form.
- Metrics, or web analytics data, show quantitative data on the frequency/number of these actions.
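Putting numbers against a measurement plan is straightforward once the actions are tracked. A minimal sketch, with the session and action counts invented for illustration:

```python
def conversion_rate(actions: int, sessions: int) -> float:
    """Share of sessions in which a tracked action occurred."""
    return actions / sessions if sessions else 0.0

sessions = 12_000
micro = {"viewed_product": 7_400, "added_to_cart": 1_900}  # micro conversions
macro = {"completed_purchase": 480}                        # macro conversion (goal)

# Report each action's rate against total sessions.
for action, count in {**micro, **macro}.items():
    print(f"{action}: {conversion_rate(count, sessions):.1%}")
```

Comparing micro-conversion rates against the macro rate shows where users drop out of the funnel, which is exactly the kind of hypothesis the issue-indication mode then investigates.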
This mode sees UX teams formulating hypotheses on issues around macro conversions, using analytics to prove or disprove their theories. The investigation tackles issues categorised into: traffic, technical issues, content, visual design, and navigation.
NN/g provides some examples below.
a. Traffic
If you want to find out which traffic source is the cause of a drop in page visitors, use Google Analytics’ Pages report with Source as a Secondary Dimension to get traffic data. Google allows you to get specific reports on a web page’s sources, be it search engines like Google, Yahoo!, or Bing; email campaigns; or direct traffic.
b. Technical Problems
UX teams can investigate issues like pages failing to load properly by looking at Event Pages, which provides a report on all tracked pages. It’s as simple as choosing a specific page to get metrics on events and anomalies, if any.
c. Content and Visual Design
With Google Analytics, UX teams can also find out if certain keywords and phrases fail to encourage web users to perform specific actions. It also allows professionals to determine if certain typography, images, and colors impede the success of calls to action. In-Page Analytics is the tool to use for both issues.
d. Navigation
If a UX professional wants to look into ineffective site links and buttons, Google Analytics’ Pages report, filtered by the page URL with the Navigation Summary tab selected, provides details on which specific pages users came from before reaching a page of interest, and where they went afterwards.
2. Issue Investigation
This mode sees UX teams pinpointing a site issue by combining quantitative data with the qualitative information gleaned from usability testing to find more clues and figure out a solution to the problem. Usability tests are not perfect, especially quick tests involving a limited number of users. This is where analytics reports come in, monitoring potentially problematic spots that usability tests may have gotten wrong.
NN/g provides a scenario where participants in a test can’t find information about a certain topic because the keyword used on the site is different from the one they’re searching for.
To determine if people are actually using these keywords the participants in the study used, UX teams can look at Google Analytics’ Search Terms, which provides lists on the words and phrases users key into the website’s search box.
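That comparison is easy to automate once you have the raw search terms. A toy sketch, with the query log and site vocabulary invented for illustration:

```python
from collections import Counter

# Hypothetical site-search log, the kind of data a Search Terms report exposes.
query_log = ["television", "tv", "television", "flat screen tv", "plasma"]
site_vocabulary = {"tv", "plasma"}  # words the site's own copy actually uses

counts = Counter(query_log)
# Popular user terms the site never uses: candidates for renaming a label
# or adding a synonym so searchers find what they're looking for.
mismatches = [q for q, n in counts.most_common() if q not in site_vocabulary]
print(mismatches)  # ['television', 'flat screen tv']
```

Here the most common user term, “television”, is absent from the site’s vocabulary, reproducing exactly the mismatch NN/g’s scenario describes.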
At Enform, we believe that the quantitative data analytics systems provide is a crucial component of developing a sound user experience on a website. However, this also means that usability and UX specialists have to become familiar with these systems, as well as the information they offer, which has for a while been the domain of optimisation experts. The fact that analytics allows us to catch issues early, before they affect conversions, and helps in investigating suspected usability problems, is reason enough to learn these systems, even if they were built for marketing, not usability.
Google Hummingbird Update Topples Current SEO Strategies
For those of you busy with your online marketing campaigns, it’s likely you missed Google’s announcement of the latest update to its volatile search engine algorithm. And not surprisingly, the update has been met with some backlash from SEOs, to say the least.
A few hours before its 15th anniversary, the tech giant unveiled Hummingbird, arguably the company’s most sweeping search update in the last 13 years. Like the recent Panda updates, Hummingbird was apparently designed to weed out content created purely for SEO rather than to provide better, higher quality search results for human users.
How Does Hummingbird Work?
Hummingbird works by dispensing with traditional keyword searches and using searches of a more conversational nature to provide search results that are more “aligned” with what users are searching for.
Google says the update places stock in providing answers to users instead of just results. While this may not seem like a big deal, many experts now see phrases like “how to” and “how do I,” which used to have no real value, as now being important.
We agree with the many search experts in the industry who see this update as a sign of Google’s desire to keep results in tune with what users actually search for on the Web. Conversational search has been a growing share of queries, and Hummingbird is simply a shift to accommodate those searches.
What this means is that Hummingbird will accommodate longer searches, shifting from traditional single keyword searches like, “iPhone 5s” to longer phrases such as, “cheapest iPhone 5s in (insert area or city),” focusing more on intent instead of just keywords.
How Does Hummingbird Affect your Business?
Google’s algorithm updates usually have immediate results on the campaigns of marketers and webmasters, but Hummingbird seems to be different. According to Danny Sullivan of Search Engine Land, they’ve yet to hear of sweeping losses in traffic after the algorithm change. Google’s Panda update in 2011 for example, which focused on removing “low-quality sites,” decimated the traffic of several thousand sites on the Internet. In contrast, Hummingbird’s been pretty quiet.
Another update that’s likely to throw a spanner in the works of many campaigns is Google’s move to encrypt keyword data, hiding it from site owners and marketers. Google has finally switched all searches to encrypted searches over HTTPS, with no keyword data passed on to site owners. This means no tracking and lumping of users together with their keyword searches. We’ve seen a rise in “(not provided)” in our analytics data since Google began encrypting searches, and now we can expect to see this figure grow.
Enform will be providing more guidance on this aspect in future updates.
The Bottom Line
Google’s Hummingbird algorithm is delivering dramatic changes in the SEO landscape but site owners and marketers should embrace the change as it is biased toward good quality marketing practice optimised for human search.
The key is, as always, to adapt to algorithm changes and understand what your users are searching for. Hummingbird simply cements the importance of providing genuine and high quality content that answers questions. Rethink how your customers use the Internet to search for your products and services, and make adjustments accordingly to your findings.
Enform has always stressed the importance of proper SEO research and audits to make sure its clients are offering answers to their customers’ search questions. Hummingbird makes this even more critical.
When they need to know more about a particular product, company or service, most internet users merely “Google” it and look at the first page of search results. But do they really get the most appropriate information? With mostly “search-dependent” users on the internet, accuracy suffers, and most users are unable to work through even moderately complex queries with search. This is to say that most users depend on the power of search but do not know how to use it effectively, and hence cannot get hold of what they really need in the first place. In fact, search practices have gone from bad to worse over time, attracting descriptions like “pathetic”, “useless” and “incompetent”.
Ironically, just as the internet has become the “know-it-all” resource, there is still not much accuracy and precision in the information actually gathered, due to search idiosyncrasies.
The Nielsen Norman Group, a firm that focuses on computer user interface development and user experience, recently released a study on the growing search discrepancies among internet users. Accurate results are not achieved because of poor search skills, and not finding what is being specifically searched for after several tries is perhaps the most frustrating user experience.
The Simpler the Search, the Better – The study further revealed that searchers depend on search but do not know how to use it. People these days are bombarded with new terminology, leading to queries that do not get them anywhere. A well-designed search facility, a simpler and more specific search method, and improved user search skills can make a real difference. A test conducted on e-commerce sites showed an overall success rate with search of 74%, a more acceptable figure than the 64% recorded on first attempts. It further showed that the difficulty of the search problem greatly affects the success of the search: the probability of success drops from 64% to as low as 28% as searches move from the simple and easy to the more complex.
Responsible Website Design For Great Searches – Apart from users’ reliance on their own searching skills, search suggestions can prove helpful but can also be limiting. Often, when a term is not included in the search suggestions, users won’t take the time to search for it, even when doing so would deliver results. Businesses should take it upon themselves to present their websites in a search-friendly manner: effective, specific and accurate. If users get exactly what they asked for, they are happy and satisfied with the results of the search. In the study of Costco’s website, one user trying to find a TV set quickly searched for “television”. The search results page showed a specific category page for TV sets, highlighting relevant features, brands, resolutions, screen sizes and facets like plasma and LCD.
As far as the Costco search study was concerned, users were happy and satisfied – they got what they asked for in their search! Enform believes that only when users have a good grasp of what effective searching really is, and when businesses improve their sites’ interfaces, can there be improved and successful online enterprise.
Despite the increasing sophistication of search engines, many irrelevant web pages are still produced. Across all the search engines, the typical approach to search still requires accuracy, skill and an improved user interface.
“The ultimate search engine would basically understand everything in the world, and it would always give you the right thing. And we’re a long, long ways from that.”
– Larry Page
Follow the link for more information on SEO and Google Adwords.
We’re delighted to be able to congratulate our team member Debbie Dankworth on becoming individually qualified under the Google Adwords Certification Program, specialising in Search Advertising.
Debbie is a hands-on marketing professional with over 20 years’ experience, and her Adwords Certification adds a significant tool to her skill set as a digital and online marketing expert. Enform is very proud to have Debbie as an associate, and we look forward to adding more value to our clients’ SEO and SEM needs.
Google Adwords Qualified individuals are globally recognised and have had to sit a number of exams to achieve the qualification, demonstrating a close understanding of Adwords campaign deployment, management and measurement. Qualified partners and individuals are also listed in the Google Partner directory.
Enform believes that Search Engine Marketing (SEM) is becoming increasingly important as the Search Engine Optimisation (SEO) space becomes more and more crowded. Native search through good SEO is still critical for long-term brand awareness and creation of core brand value, but SEM is the tool to help launch products, promotions, marketing communication programs and anything that is time critical.
This makes SEM a particularly effective tool to quickly create and deploy campaigns that drive traffic and customers to your site, e-commerce store or social media resource. A simple campaign can be created and deployed in under an hour with results visible almost immediately. Equally, the campaign can be turned off or on in seconds, and the Google Adwords management console even allows that to happen automatically.
Setting up an Adwords account is free and relatively easy, so why use an agency or Adwords professional? To quote Google themselves:
“Like many advertisers, you may not have a significant amount of time to invest in learning AdWords and managing your own advertising account. Hiring a professional can help save you time while maximizing the return on your investment.”
From our point of view, we see the difference a high quality score can make to the cost and effectiveness of an Adwords campaign. This goes directly to the question of ROI, and there are countless horror stories about people blowing their entire budget in one day. Yes, you can do it yourself, but wouldn’t you rather be running your business or your marketing strategy?
Google Adwords is still the number one search engine marketing tool and accounts for the bulk of all SEM ad spend globally. That’s why we strongly support the Google Adwords certification program to make sure our clients get the best support in this important area.
Congratulations again Debbie!
When was the last time you ran a full-text search on Google? Did you notice PDF files in your search results? Have you ever wondered how the PDF files on your site could appear on the search engine results pages (SERPs)?
The good news is that Google recognises online PDF files as no different from ordinary web pages, so the search engine will readily index them, provided of course they’re worth ‘crawling’ over.
With PDF files representing a substantial share of page views on the Web, it’s understandable why there’s interest in how PDF documents rank against web pages. At Enform, we’ve received several questions on PDF optimisation for search engines, which is why we’ve compiled a set of brief tips on how to maximise your PDF documents’ effectiveness online.
Text is Best
If you want your PDF files to be indexed by Google, make sure they are mainly composed of text. Google’s spiders are programmed to crawl text, and while image-heavy documents may still be indexed, they won’t be as effective as text.
Text, however, is only one aspect of optimisation.
Work on Good Titles
With PDF files, Google places significant importance on the title. In the case of PDFs, the blue line of underlined text you see in a search engine result comes from the document’s “Title” information field. Oftentimes the indexer won’t find any information here, so it proceeds to scan text on the first few pages of the document, something which generates poor results.
A common scenario with PDF files is that the indexer will use any text found in the “Title” field, even if it’s gibberish, which explains the millions of PDFs with nonsensical titles. Obviously, that’s not how you’d want to optimise your documents: you want to tell Google what your PDFs are about and what information they contain, without beating around the bush.
Have Quality Content
While comparing web pages and PDFs is like comparing apples and oranges, both add to the SEO value of your website if they feature quality content, with keywords strategically placed in headings and tags. Likewise, basic linking of your PDFs to your site is one way of optimising the documents.
These are just some of the most basic things you can do to optimise your website’s PDF files for search engines.