Have you just published your new website without knowing whether its SEO is in good shape? If you simply hired someone to design the site and left it at that, you may run into a series of problems with Google that you really do not want.
When we publish a new website we are eager to show it to everyone, to have people see it and even give their opinion; but, on the other hand, the last thing we want is for Google not to like it.
I would almost venture to say that search engines are “tired” of crawling newly created web pages with lousy SEO and many indexing issues.
You have to think of a search engine like Google as a machine that uses its spiders to look for files on servers, then categorizes and stores them so they can be served to users when they match a search.
But what happens if these spiders do not interpret a web page well?
New websites that have not been reviewed by an SEO expert often face a big problem: they fail to attract quality traffic.
And that means lost money for the company or business owner.
So, what SEO points do you need to check so that Google does not throw your new website off a cliff where nobody will ever find it?
All you need to do is follow the key SEO points listed here, gathered after a detailed review of SEO forums, to make sure everything is in order.
#1. Check the titles of all your pages
It is not surprising that many new websites (blogs or online stores) generate duplicate title (H1) problems, because their pages are very similar to each other.
Make sure each page title is unique and also points to a specific keyword.
Another common SEO mistake on new websites is to use the same keyword on different pages. Check your keywords to avoid keyword cannibalization.
If you are still unsure about the details, contact an SEO expert in your area.
Although they carry less weight for search engines, page descriptions are still useful, so, as with titles, each page should have its own unique description.
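As a quick illustration, here is a minimal sketch of the head section of two pages with unique titles and descriptions (the store name, URLs and keywords are made up for the example):

<!-- Page: /running-shoes/ - targets the keyword "running shoes" -->
<head>
  <title>Running Shoes for Men and Women | ExampleStore</title>
  <meta name="description" content="Browse lightweight running shoes with free shipping and 30-day returns.">
</head>

<!-- Page: /trail-shoes/ - targets a different keyword, so there is no cannibalization -->
<head>
  <title>Trail Running Shoes with Extra Grip | ExampleStore</title>
  <meta name="description" content="Waterproof trail running shoes designed for muddy and rocky terrain.">
</head>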
#2. Have a correct robots.txt file
The robots.txt file is more important than it seems. Put simply, it is used to block Google from crawling pages we do not want it to crawl, either because they are unimportant or because they can generate duplicate content, among other reasons.
An example of a robots.txt file for a new WordPress online store could be the following:
User-agent: *
Disallow: /wp-admin/
User-Agent: Googlebot
Allow: /*.css$
Allow: /*.js$
Allow: /wp-admin/admin-ajax.php
Disallow: /author/
Disallow: /tag/
Disallow: /wp-login.php
Disallow: /*/feed/
Disallow: */*feed/
Disallow: */comments/
Disallow: /xmlrpc.php
Disallow: /*?s=
Disallow: */trackback
Sitemap: https://domain.com/sitemap.xml
An important SEO detail here is to include the Sitemap line, which helps search engine spiders discover your URLs.
#3. Sitemap file created and submitted to Google
Sitemaps are files that contain a list of all the URLs that make up a website. As a rule, they are in XML format.
I have seen many, many new websites that do not even have a sitemap. Although it is not mandatory, it is recommended, so it is better to have one.
If you want to optimize how the pages of your new website are crawled, it is important to generate this sitemap and submit it to Google through the Search Console tool.
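As a reference, a minimal sitemap could look like this (the domain and URLs are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://domain.com/</loc>
  </url>
  <url>
    <loc>https://domain.com/shop/</loc>
  </url>
  <url>
    <loc>https://domain.com/blog/first-post/</loc>
  </url>
</urlset>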
If you have trouble creating or understanding a sitemap, you can contact an SEO company or SEO expert near you; for example, searching Google for “SEO service in dubai” will return results based on that location.
The site's webmaster must create a property in the Google tool and submit the sitemap through it, telling the search engine which URLs on the website matter most.
If you want to make sure Google crawls your new website correctly, submitting a sitemap is possibly the best option.
Bing
Much is said about Google and very little about other search engines like Bing. My advice is to register the site in Bing Webmaster Tools and verify the property so you can optimize for this search engine as well.
A potential 20% of search engine traffic should not be wasted.
#4. The website must have internal links
In SEO there is a thing called “orphan pages”, although I prefer the term “zombie pages”: pages that are not linked from any other URL on the site.
A technique widely used in SEO to rank online stores is to link from blog articles to store products by creating internal links.
Google representatives like John Mueller or Gary Illyes have stated on more than one occasion that internal links are very important, both for improving crawling and for ranking in organic results.
The anchor text of your links helps Google define the topic of a page in the broadest sense. Use it to improve the SEO quality of your website.
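A minimal sketch of what such an internal link looks like inside a blog article (the product URL and anchor text are invented for the example):

<!-- Inside a blog article about trail running tips -->
<p>
  If you train on wet terrain, a pair of
  <a href="https://domain.com/shop/trail-running-shoes/">waterproof trail running shoes</a>
  will keep your feet dry on long runs.
</p>

The descriptive anchor text tells Google what the linked product page is about, instead of a generic “click here”.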
#5. A slow website due to large or heavy images
Some new websites are designed without taking into account the dimensions of the images or, worse, their file size, making the page slower than my grandfather's mule.
Conclusion: Google does not like slow web pages = worse organic rankings.
If you want official confirmation of the problem with slow websites, here is what Google's own John Mueller says on this point:
John recommends that the average server response time stay around 100 ms; otherwise, Google will not crawl as many pages as it otherwise would.
Upload images at the appropriate dimensions and compress them before putting them on the web. This alone will save you from a poor Google score.
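For example, an image that has already been resized and compressed, served with explicit dimensions and lazy loading (the file name and sizes are placeholders), avoids wasted bandwidth and layout shifts:

<!-- Pre-resized, compressed image with explicit dimensions and lazy loading -->
<img src="/images/trail-shoe-800w.webp"
     width="800" height="600"
     alt="Waterproof trail running shoe"
     loading="lazy">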
My advice is to check the score your site gets in PageSpeed, especially for the mobile version, and from there take action to reach an acceptable score, say 50.
Here are a couple of articles that can help you improve this critical point:
How to improve the loading speed of the web
How to speed up a website by 25%
#6. Check for errors on product pages
If it is a new online store, you need to verify that there are no major errors on the product pages.
This can be checked in Google Search Console. Structured data is one of the strongest SEO signals today, and having the correct markup implemented on product pages can mean a significant improvement in organic rankings.
Structured data is the language that search engines understand
This article on how to implement structured data markup in an online store can help you.
Here's an example of the markup for a single product page in JSON-LD:
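(A minimal sketch; the product name, price, SKU and availability values below are placeholders.)

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Waterproof Trail Running Shoe",
  "image": "https://domain.com/images/trail-shoe-800w.webp",
  "description": "Lightweight waterproof running shoe for muddy and rocky terrain.",
  "sku": "TRS-001",
  "brand": {
    "@type": "Brand",
    "name": "ExampleBrand"
  },
  "offers": {
    "@type": "Offer",
    "url": "https://domain.com/shop/trail-running-shoes/",
    "priceCurrency": "EUR",
    "price": "89.95",
    "availability": "https://schema.org/InStock"
  }
}
</script>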
If you have created a new online store, make sure the product pages implement at least the required schema.org properties.
#7. The problem of a botched HTTP to HTTPS migration
If, for example, you have a redesigned website that has also moved from HTTP to HTTPS, you can run into serious problems with organic rankings in Google.
Many websites have seen their organic traffic drop when moving from HTTP to HTTPS because they did not set up the corresponding 301 redirects correctly.
Google's own representatives say the following about migrations to HTTPS:
If you migrated from HTTP to HTTPS and didn’t redirect all HTTP addresses to HTTPS with clear 301 redirects, or if you deleted many pages or blocked bots using robots.txt, you should expect further fluctuations in your site’s ranking.
As a general rule, this means losing a large number of visits. This article on how to migrate a website without losing SEO can help you.
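As a reference, on an Apache server the blanket HTTP-to-HTTPS redirect is typically handled with a rule like the following in .htaccess (a sketch; adapt it to your own server and domain):

# Redirect every HTTP request to its HTTPS equivalent with a 301
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]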
#8. Having URLs with #
Sometimes when designing new websites, developers generate anchors (#) in URLs to take the user from one part of the page to another specific spot.
In terms of usability or user experience this can be useful in certain cases; however, keep in mind that when there is a hash (#) in a URL, such as https://www.dominio.com/tienda.html#other, Google will not index it.
In custom web development the URLs can sometimes appear as https://dominio.com/#categoria23. Keep in mind that Google does not like this.
#9. Organizing content into topics or clusters
One of the technical SEO points I can recommend when publishing a new website is to try to group the content into “clusters” or topics.
If Google does not have an efficient way to crawl and index your content, it will not be able to establish topical connections between all the related content, which is why it makes sense to create topic clusters.
In short: if you create topics, your content should be classified better the more internal links point to the main page of each topic.
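As a rough sketch (the URLs and titles are invented), a pillar page for a topic can link out to its cluster articles, and each article can link back to the pillar:

<!-- Pillar page: /seo-for-new-websites/ linking to its cluster articles -->
<ul>
  <li><a href="/seo-for-new-websites/robots-txt/">How to set up robots.txt</a></li>
  <li><a href="/seo-for-new-websites/sitemap/">How to create and submit a sitemap</a></li>
  <li><a href="/seo-for-new-websites/https-migration/">Migrating to HTTPS without losing traffic</a></li>
</ul>

<!-- Inside each cluster article, a link back to the pillar page -->
<a href="/seo-for-new-websites/">SEO checklist for new websites</a>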
Conclusion
If you follow this SEO checklist, you can be sure that your newly designed website will avoid many problems with Google (or any other search engine) and will also start out with a much better score.