SEO практика
Telegram channel of the "SEO практика" group - practical SEO.

Here are a few must-have scripts - semantic search through a vectorized Wikipedia (SentenceBERT) with the Weaviate vector search engine ;)
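For anyone unfamiliar with what such a pipeline actually does: below is a minimal, self-contained sketch of the idea behind semantic (vector) search. The "embeddings" here are tiny hand-made toy vectors, not real SentenceBERT output, and no Weaviate instance is involved; it only illustrates the nearest-neighbour ranking step that the real stack performs at scale.

```python
import math

# Conceptual sketch only: in the real pipeline, SentenceBERT produces the
# embeddings and Weaviate stores and queries them at scale. Here we use
# tiny hand-made vectors and plain cosine similarity to show the core idea.
def cosine_sim(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy "document embeddings" (in practice: model.encode(text) from sentence-transformers)
docs = {
    "Eiffel Tower": [0.9, 0.1, 0.0],
    "Python programming": [0.1, 0.9, 0.2],
    "Paris landmarks": [0.6, 0.4, 0.3],
}

def search(query_vec, top_k=2):
    # Rank all documents by cosine similarity to the query vector
    scored = [(name, cosine_sim(query_vec, vec)) for name, vec in docs.items()]
    return sorted(scored, key=lambda item: item[1], reverse=True)[:top_k]

query = [0.85, 0.15, 0.05]  # imagine: the embedding of "famous places in Paris"
print(search(query))  # "Eiffel Tower" ranks first, "Paris landmarks" second
```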


Which SEO techniques should we pay attention to in 2022?


Is it possible to use a blog CMS on a custom-built site?
I thought of the option of doing a WP installation on , but in that case would there be a positive effect on the main domain?

Indexing Paradigm Shift

It's too early to be absolutely certain, but my strong suspicion is that the changes we have seen in indexing are not entirely temporary: many sites at the lower end of the 'global importance' spectrum are not getting content indexed as quickly as before, and some are struggling to get it indexed at all.

You see, it comes after some years of mumblings and rumblings within Google about their issues with quantity over quality spam. And coincidentally, follows right on the heels of vague announcements about new methods of dealing with spam in a spam update.

The problem for Google was sites like Quora, and a hundred other similar sites, where masses of user-generated content could produce literally hundreds of thousands of new pages over the course of just a few hours. Yet the vast majority of the new URLs this created were most often just rehashes of the same discussions on a thousand already indexed threads and URLs from the same site.

Then there is the increasing availability of so-called AI copywriting software: software that can create huge volumes of very low-quality content at almost no cost in resources. Now, despite the bullshit from the so-called AI content vendors, machine-generated content spam is nothing new. Years ago there was software that did much the same thing, scraping and spinning existing content into new forms according to a set of rules and instructions; all the new software does is call those rules and instructions "AI".

Google have systems to deal with this.

I just believe that they are improving and extending them. Crawl Prioritization has always existed in Google, and it is largely a scalable, self-regulating thing. Every known URL that Google wants to crawl, either to index for the first time or to revisit to check for updates, is assigned a place in a long queue according to a priority scoring system. Ultra-high-priority pages and revisits are taken care of almost at once, while at the other end, ultra-low-priority URLs may wait months or even years, and may never get a turn at all, as higher-priority items keep being added to the queue ahead of them.
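The queue described above can be modelled as a simple priority queue. This is purely a toy illustration with invented priority scores, not a claim about Google's actual internals:

```python
import heapq

# Toy model of a crawl-priority queue: URLs with higher priority scores are
# crawled first; low-priority URLs can wait indefinitely as new high-priority
# items keep jumping ahead of them. All scores here are invented.
class CrawlQueue:
    def __init__(self):
        self._heap = []
        self._counter = 0  # tie-breaker so equal priorities stay FIFO

    def add(self, url, priority):
        # heapq is a min-heap, so negate the priority to pop highest first
        heapq.heappush(self._heap, (-priority, self._counter, url))
        self._counter += 1

    def next_to_crawl(self):
        return heapq.heappop(self._heap)[2]

q = CrawlQueue()
q.add("https://example.com/old-blog-post", priority=1)
q.add("https://example.com/", priority=90)
q.add("https://example.com/new-product", priority=40)
print(q.next_to_crawl())  # the homepage is crawled first
```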

It has always been quite typical that an 'average' website with 'average' importance and link popularity for its size would have about 90% of its content indexed, and about 10% not indexed, at any given point in time. Sites such as Amazon and eBay might have a far, far higher percentage of unindexed product pages simply because they change so quickly, and many pages sit at such deep levels of the site in terms of links to follow.

However, over the past few years, I have seen a steady increase in the number of regular, average-ish business sites that, having been advised to blog and create fresh content every few days, now have a MAJORITY of their content not freshly indexed and reindexed.

Basically, the PageRank flowing into and around the site from the few pages that earned any genuine links was being spread so thin across the thousands of essentially pointless blog posts nobody cared about that no single page retained enough strength to be seen as important.
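The dilution argument is simple arithmetic. A toy sketch with invented numbers (real PageRank flow is far more complex, with damping factors and the full link graph, but the intuition holds):

```python
# Toy illustration of dilution: a fixed amount of incoming "link equity"
# spread evenly across an ever-growing number of pages leaves each page
# with less and less. The numbers are invented for illustration only.
def equity_per_page(total_equity, page_count):
    return total_equity / page_count

site_equity = 100.0
print(equity_per_page(site_equity, 20))    # 5.0 per page on a lean site
print(equity_per_page(site_equity, 2000))  # 0.05 per page after years of filler posts
```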

My suspicion then is that Google have tightened this up, just a little, to help them find truly worthwhile pages even in a swamp of dross, and at the same time, make it much clearer to quantity spammers that their tactic is self-destructive.

So, all of this means that the following is my advice on how to focus with content going forward:

First, do not publish new stuff just to publish new stuff. If a new page is neither built to convert for a specific campaign (e.g. a new product line), nor a well-thought-out piece of content you are sure will attract a bunch of genuine new citations, rethink it.

You need to focus on more results from fewer pages going forward.
It is better to spend 2-3 months creating one absolute killer piece of content, such as a major study, a really good survey with expert insights, or something else truly special and remarkable that will gain you a bunch of high-value links and buzz, and have people searching *specifically* for that study/page, than to churn out minor posts a few times a week that gain minor links.


Crawl Prioritization was always going to become more and more of an issue over time. That's how power laws work. The rich get richer.

Crawl Priority is a complex system, and some of it is driven by circumstances such as news events, big trends, and circumstances beyond your control. But a lot of it is stuff you can have some control over, or at least, intelligently leverage and influence.

'Importance' of a site is one of the signals for priority, and the factors that show importance are the strength and power (not volume) of links and citations. When a major or local news site mentions you, that is a sign of importance. When your granny mentions what a good boy you are, that isn't. If you go through some incredible shenanigans to get adopted so that you have a thousand 'Grannies', and they all cite what a good boy you are, that STILL isn't important. Focus on quality links, links that can't be bought, or faked, or gained by the worst of your rivals.

If you have pages that get a lot of search impressions but almost no clicks, consider revising, updating, or else pruning them. If Google sees that the pages it already has from your site rarely perform, it doesn't tend to adjust the site's ranking, but it may lower the priority of grabbing any more.
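A hypothetical sketch of that triage, assuming a Search Console-style export of (url, impressions, clicks) tuples; the threshold values are invented and should be tuned to your own data:

```python
# Flag pages with high impressions but very low CTR as revise/prune candidates.
# Data shape and thresholds are assumptions, not from any specific tool's API.
def pruning_candidates(pages, min_impressions=1000, max_ctr=0.005):
    flagged = []
    for url, impressions, clicks in pages:
        ctr = clicks / impressions if impressions else 0.0
        # Only judge pages with enough impressions to be statistically meaningful
        if impressions >= min_impressions and ctr < max_ctr:
            flagged.append(url)
    return flagged

report = [
    ("/guide-to-widgets", 15000, 600),   # 4% CTR - performing fine
    ("/blog/filler-post-17", 4000, 3),   # ~0.08% CTR - revise or prune
    ("/blog/tiny-post", 50, 0),          # too few impressions to judge
]
print(pruning_candidates(report))  # ['/blog/filler-post-17']
```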

Raise your profile. Be part of communities and conversations. Not to have your contributions there filled with links, but so that you get mentioned more, that you are discussed because of who you collaborated with, assisted, or generally were there for. This shows a level of importance wider than just what you are saying about yourself.

If you are regularly adding content to your site, for promotions, products, or simply news and updates, make absolutely certain that you are gaining links and citations, signals of importance, even faster than you are adding content that dilutes what you have. Focus on earning great citations.

Another part of crawl priority is in supply and demand. Think very, very hard before writing content that effectively already exists out there from a thousand other sources. Focus on content that is hot topically, that is 'on trend', and especially where you have a unique perspective (or can hire one), that separates your content from the masses.

#crawling #indexing #notindexed #crawlpriority

Footnote: This is a repost of something I originally wrote elsewhere (so apologies to those who saw it before), but given how often questions that relate to this are arising, I felt it needed to be available here too.

Do you use, and can you recommend, an outreach/blog outreach service or company for foreign markets (EU, US, worldwide)?


Good morning, has anyone worked with Quetext ( for checking the uniqueness of English-language texts? Opinions welcome if you're familiar with it, thanks!
Until now I was using , but it has been glitchy the past few days and I'm looking for good alternatives.


New question (I'm on a roll today :D):
Has anyone noticed a tendency for Googlebot not to honour the 5-second rule when rendering scripts while crawling SPA sites (i.e. requiring a faster response, under 5 seconds)?


So FID is being replaced, apparently:
Google Is Creating A New Core Web Vitals Metric.


In May 2022, Alexa, the most popular service for measuring website rankings on the internet, will be shut down. I can't even remember how many years it has existed; that's how many years we spent together. Advertisers found us through it, and the Alexa top 100,000 was the dream threshold for getting on their radar.

Sic transit gloria mundi, as they say.


To keep up with the core updates.

Two Telegram bots useful for SEO:

@dmcamasscheck_bot - pings you when your domain receives a new DMCA complaint. You can add domains in bulk.

@domainsmasscheck_bot - pings you when a domain has fewer than 30 days left until its registration expires. You can add domains in bulk.

Testing whether the Google Disavow Links (GDL) tool actually works

(an Ahrefs experiment)

How do they test it? Quite simply: they add every inbound link for the experimental pages to the GDL file and track the pages' visibility in Google. It is important to wait out several Core updates to filter out the noise.

In total, Ahrefs disavowed 3,476 links pointing to 3 blog pages, all at once. The drop in visibility unfolded slowly over the course of a month, but here is an interesting detail: visibility recovered very quickly after the file was removed.

The experiment covered only 3 pages, and it would be more interesting to look at the effect on an entire site over a longer test period. Nor should other factors that could cause visibility problems be ignored (algorithm changes, competitor pages gaining more links, and so on).

The article's authors ( ) urge you to use the GDL file with caution and never to add all of your links to it.

⚡️ Conclusion: the GDL file definitely works, but it will probably take longer for these changes to be factored into rankings.


This is how link building is done.
Follow me for more good practices :)


Proud to share my article, published by one of the most popular international SEO toolmakers, SEO PowerSuite, on my favourite topic, tourism :)


Changes, changes, and more changes...


Hello, could you share any first-hand experience with moving a blog from olddomain/blog to a new, freshly registered domain, with 301 redirects for all URLs? Did you lose organic traffic? If so, by what percentage and for how long?