Blog | Kazma Technology

What are some on-page SEO mistakes?

1. Slow website / long loading times:

SEO mistake #1: Long loading times. Nobody likes slow websites, not even search engines. Avoid embedding oversized, uncompressed images, and do not overload your website with dozens of unnecessary scripts. If your hosting provider supports it, activate page compression using GZIP or mod_deflate. A handy tool for keeping an eye on the loading times of your website is Google PageSpeed – after the analysis, you receive a very detailed evaluation with tips on how to improve your loading times. I also recommend testing with GTmetrix, a free tool that evaluates website speed in detail and suggests optimizations.
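If your site runs on Apache and your host allows custom `.htaccess` rules (an assumption – check with your hosting provider), text compression via mod_deflate can be enabled with a snippet along these lines:

```apacheconf
# Enable GZIP compression for text-based resources via mod_deflate.
# The <IfModule> guard keeps the site working if the module is missing.
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css text/plain
  AddOutputFilterByType DEFLATE application/javascript application/json
  AddOutputFilterByType DEFLATE image/svg+xml
</IfModule>
```

On nginx, the equivalent is the `gzip on;` directive family in the server configuration.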

2. Duplicate content / near-duplicate content:

SEO mistake #2: Duplicate content (or near-duplicate content) describes the problem that two or more pages under different URLs carry the same, or almost the same, content. Duplicate content is poison for a good ranking and very often leads to your website being devalued by the search engines or, in the worst case, filtered out. The most common cause is that the page can be reached both with and without www and thus serves the same content twice. You should also never copy content from other sites. If you want to learn more about the “canonical” topic, I recommend the documentation from Google Webmaster Support.
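The www/non-www duplication mentioned above is usually fixed with a permanent redirect. Assuming an Apache host with `.htaccess` support (and with `example.com` as a placeholder domain), a sketch looks like this:

```apacheconf
# Redirect all non-www requests to the www host with a permanent (301)
# redirect, so only one URL variant ever serves the content.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

Whether you standardize on the www or the non-www variant does not matter; what matters is that you pick one and redirect the other.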

3. Too little content / insufficient content:

SEO mistake #3: Thin content. Content is king! A major focus of website optimization is providing search engines with meaningful and unique content. High-quality content binds users to your website, increases dwell time, reduces the bounce rate, and often earns a significantly better placement in the search results. The content of your page should offer the user clear added value that is worth sharing on the Internet (instructions, entertaining images or videos, checklists, infographics, etc.). Add further value to your site by linking to trusted sources, placing pictures or videos, structuring the text well, and avoiding spelling mistakes. The ratio of content (text) to code should be between 25% and 40%.

4. Hidden content / hidden texts:

SEO mistake #4: Hidden content. If there is one thing that Google's Panda algorithm hates, it is manipulation such as hidden content that is only visible to bots. A basic principle in search engine optimization is: don't hide anything from Google! So-called hidden content, which is not visible to the user and is only placed to feed search engines with SEO texts, will eventually be punished. A popular method is, for example, massively placed dropdown elements whose text content is recognized by search engines in the source code but is only shown to the user when he actively clicks on the element. White text on a white background, or invisible text blocks hidden using CSS, is devious… an absolute no-go!
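To make the anti-pattern concrete, this is the kind of CSS-hidden keyword block (a fabricated illustration, not taken from any real site) that search engines treat as manipulation:

```html
<!-- DO NOT do this: keyword text hidden from users, visible to crawlers. -->
<div style="display: none;">
  cheap shoes buy shoes best shoes online shoe shop
</div>

<!-- Equally manipulative: white text on a white background. -->
<p style="color: #fff; background-color: #fff;">more hidden keywords</p>
```

Legitimate uses of `display: none` (accordions, mobile menus, accessibility helpers) are generally fine; it is the intent to deceive crawlers that gets penalized.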

5. Keyword stuffing:

SEO mistake #5: Keyword stuffing. Don't spam Google and the user with your keywords! In so-called keyword stuffing, search terms are repeated at an unnaturally high frequency in the title, the metadata, and the content to inflate their relevance. In the early days of search engine optimization, this was common practice for achieving better rankings for a particular search term – today it is viewed as manipulation and punished with a place in search-hit nirvana. It is entirely sufficient if the keyword occurs once in the page title and the meta description, and it makes absolutely no sense for every tenth word in the content to be your keyword.

6. Bad or inappropriate keywords:

SEO mistake #6: Bad keywords. Keyword research is the alpha and omega of every website optimization, and you should be clear in advance about what your site should be found for. Choose the keywords wisely and make sure that they fit the topic of your site, service, or offer! Do not only take the individual keyword into account, but also related terms and so-called long-tail keywords – that is, term chains or search phrases that contain the keyword for which your page is to rank. Optimize a single page for only one specific search term and – most notably – research the search volume of the most frequent search queries for your keyword, because a keyword that nobody searches for will bring you absolutely nothing! One of the best free tools for keyword research is Google's Keyword Planner, although you need an AdWords account to use it. I also recommend Hypersuggest – a potent keyword suggestion tool.

7. Missing or insufficient TITLE tag:

SEO mistake #7: Bad page title. The page title <title> does not have a direct influence on the ranking of the page (it is not an essential ranking factor), but it plays a vital role in the display of the page in the search results and is intended to encourage the searcher to visit your website. The better the page title conveys to the searcher what he will find on your page, and the more appealing it is (e.g., through the use of special characters), the higher the probability of a click on your search hit. Each page should have its own page title, which briefly and concisely describes the actual content and contains the main keyword. You can find detailed information in the article on the optimal page title.
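As a sketch, a concise, keyword-leading title for a hypothetical page on this very topic might look like this:

```html
<head>
  <!-- Unique per page, main keyword near the front, roughly 50-60
       characters so it is not truncated in the search results. -->
  <title>On-Page SEO Mistakes: 15 Errors to Avoid | Example Blog</title>
</head>
```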

8. Missing or inadequate META description:

The META description is the counterpart to the page title: the short snippet of text that is displayed in the search hit below the link to your website.

Google now often pulls text excerpts from your page that match the search query and displays them in the search hit instead of the description you have predefined, but it doesn't hurt to define the snippet yourself.

Here, too, each page should have its own, individual META description, which contains the primary keyword as well as related terms and gives the user a brief and understandable explanation of what he will find on your page.

The META Description should not exceed a maximum length of 160 characters.

If you want to know what your search hit will look like in the SERPs, you can preview it with one of the many free SERP snippet preview tools.
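A per-page description (the text below is a hypothetical example) is declared in the page head like this:

```html
<head>
  <!-- Unique per page, contains the primary keyword, max. ~160 chars. -->
  <meta name="description"
        content="Avoid the 15 most common on-page SEO mistakes: slow loading
                 times, duplicate content, keyword stuffing, and more.">
</head>
```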

9. Missing or defective ALT / TITLE tags for images:

Optimizing your website for search engines means using the options that are not directly visible to the visitor – for example, the alternative texts and descriptions for images and graphics.

The ALT and TITLE attributes are ideal places for relevant terms and information for which your page should rank, and they also improve the accessibility of your website: screen readers read the ALT text aloud to blind visitors, so meaningful image descriptions are essential for users who cannot see the images.

Avoid keyword spam and, instead, try to use meaningful words to describe the image in such a way that someone would know what is on it even without seeing it.
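A descriptive example of image markup with both attributes (file name and texts are illustrative):

```html
<!-- ALT describes the image content for crawlers and screen readers;
     TITLE adds an optional tooltip shown on hover. -->
<img src="/images/gzip-settings.png"
     alt="Screenshot of an Apache .htaccess file with mod_deflate GZIP rules"
     title="Enabling GZIP compression in .htaccess">
```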

10. Missing or irrelevant headings:

Apple shows how it's done: emotional, meaningful headlines that get straight to the heart of the topic and make you want more! Use headings to give the main keywords on your page more weight.

When evaluating a page, Google also takes into account the heading hierarchy (H1, H2, H3, …) – use the H1 heading, for example, to place the most crucial term and only use it once! The other headings (H2, H3, H4, etc.) can also be used several times and should also contain the focus keyword or related terms again and again.

Ideally, your most important keywords appear in the page title, meta description, and in headings, so that search engines can determine an apparent reference to the actual content of your page. By the way, “Welcome to my website” is not a meaningful heading.
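Applied to a hypothetical page optimized for “on-page SEO”, the hierarchy described above could look like this:

```html
<!-- Exactly one H1, carrying the most important term. -->
<h1>On-Page SEO: The 15 Most Common Mistakes</h1>

<!-- Subheadings may repeat; they echo the focus keyword and related terms. -->
<h2>Technical On-Page SEO Mistakes</h2>
<h3>Slow Loading Times</h3>
<h3>Missing Canonical Tags</h3>

<h2>Content-Related On-Page SEO Mistakes</h2>
<h3>Keyword Stuffing</h3>
```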

11. Missing or incorrect canonical tags:

If there are several variants of a page on your website, e.g., through the expansion of search filters, color variants, or the like, then you should define which is the corresponding main page.

The canonical tag (alternatively also called canonical link, canonical URL, URL canonicalization, or URL normalization) specifies the unique URL for the content of your page.

It is primarily used to avoid duplicate content on your page by telling the search engine the preferred, or representative, URL of the page.

For example, if my blog were accessible both with and without www, I could use a canonical tag to define which of the two URLs is representative. If you want to find out more about rel=“canonical”, I recommend the documentation from Google Webmaster Support.
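The tag itself goes in the <head> of every variant of the page and points at the preferred URL (the domain below is a placeholder):

```html
<head>
  <!-- All variants (with/without www, filter or color parameters, etc.)
       declare the same representative URL as canonical. -->
  <link rel="canonical" href="https://www.example.com/blog/onpage-seo/">
</head>
```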

12. Insufficient site maintenance and neglect:

Merely putting a page online and then never looking after it again is careless! Feed your page with exciting and, above all, up-to-date content, and also maintain the technical structure.

Make sure there are no duplicate page titles or meta descriptions and no dead links, provide a well-prepared 404 error page and proper 301 redirects, and please don't let old pages decay (no-longer-existent images, broken links, etc.).

If individual pages no longer exist and you want to remove them entirely from the index, it is best to give them status 410 (Gone), because this way Google knows that the page will never be available again. If, however, you want to remove an old page because you have created a new page on its topic, set up a 301 redirect to the new page instead.
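Assuming an Apache host with `.htaccess` support (the paths below are placeholders), the two cases sketch out like this:

```apacheconf
# Case 1: the page is gone for good - answer with 410 (Gone).
Redirect gone /old-page-removed-forever.html

# Case 2: the topic lives on at a new URL - send a permanent 301 redirect.
Redirect 301 /old-article.html /new-article.html
```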

13. Inadequate internal links:

Try to link your internal pages as well as possible so that the search engine crawlers can optimally index them. Use the opportunity to describe the content of the link briefly and concisely by using the TITLE attribute.
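An internal link with a descriptive TITLE attribute (URL and texts are illustrative) might look like this:

```html
<!-- The TITLE attribute briefly and concisely describes the linked content. -->
<a href="/blog/optimal-page-title/"
   title="Guide to writing an optimal page title for SEO">
  How to write the perfect page title
</a>
```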

It is best to limit yourself to linking pages that are directly related to each other, because then the links also match the relevance of the content.

In any case, avoid wild link spam or links to irrelevant content and avoid mass links in the footer, as these are of little relevance in terms of SEO anyway.

14. The too high number of outgoing links / Nofollow links:

Try to limit external (outgoing) links to a sensible degree. Sources and references to trustworthy sites are, of course, always a good idea, but they also let link juice flow out of your page. Especially while your site is still young and weak, the number of external links should be kept low. To a limited extent, you also have the option of marking links with nofollow, but this should be used with care, as it disrupts the flow of links for search engines.

Also, check regularly whether the pages to which you are linking still exist – I recommend Xenu's Link Sleuth for this, which lets you check all internal as well as external links.
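Marking a single outgoing link as nofollow is done via the rel attribute (the URL is a placeholder):

```html
<!-- Tells crawlers not to pass link equity through this particular link. -->
<a href="https://example.com/untrusted-page/" rel="nofollow">
  external reference
</a>
```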

15. Missing Robots.txt

With the help of robots.txt, you tell search engine crawlers which pages they may crawl and which they should stay away from. The file is not an absolute must, but it should be provided if possible.

Using the robots.txt, you tell the search engines whether there are individual pages or directories on your website that should not be crawled (e.g., the admin area or pages that you only maintain for friends and acquaintances).

The file is stored in the root directory of your website and should only be used to exclude individual pages. You don’t need to tell Google & Co. what should be indexed, only what should not be indexed.
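A minimal robots.txt (directory name and domain are placeholders) that keeps crawlers out of an admin area while leaving everything else crawlable could look like this:

```txt
# Applies to all crawlers.
User-agent: *
# Exclude the admin area; everything not listed stays crawlable.
Disallow: /admin/
# Optional: point crawlers at the XML sitemap.
Sitemap: https://www.example.com/sitemap.xml
```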
