Hello everyone. It's time for Google Search Office Hours once again. I'll be the one delivering it today. Thank you very much for joining. Okay, so let's start with a request from Google. We use a hashtag for Google Search Office Hours, so if you have any opinions or comments regarding the questions, please feel free to post them using that hashtag. The links to the articles I introduce will be posted later in the description box of this video. Also, it would be a great encouragement to our team, so if you enjoyed the video, please give it a thumbs up and subscribe to our channel. Now, let's move on to the announcements from Google. Regarding the main blog posts, these four articles have been published since the last office hours: streamlining Search Console analysis with a new AI-powered feature; the addition of a monthly view to Search Console; highlights and thank-you messages from Search Central Live 2025; and an announcement regarding Google Discover and the February 2026 core update. I'd like to touch on each of them briefly. First, streamlining Search Console analysis with a new AI-powered feature. While Search Console's performance reports are a powerful tool for analyzing organic search traffic, finding the data you need can sometimes take longer than expected. So, at the end of last year, a trial version of a new feature was added to the performance report. It aims to reduce the effort required for data selection, filtering, and comparison: by utilizing AI, it lets you describe the analysis you want to see in natural language. For more details, please read the blog post, and please give it a try.
The second blog post announces the addition of a monthly view to Search Console. Analyzing search traffic trends is an important element of SEO, and detailed daily data is essential for understanding the problems you face and anticipating changes in traffic. However, daily data can sometimes make it difficult to grasp the overall picture. Until now, the graphs in the performance report have shown daily data, but this blog post announces that a new monthly view was added to Search Console's search performance report as of the date the post was published. Using this feature, you can adjust the time aggregation of the performance graph, smooth out daily fluctuations, and focus your analysis on the overall trends in website traffic. For more details, I recommend checking out the blog post; please do try this one too. And the third one, which I'd be happy if you all could remember as well, is the Search Central Live 2025 highlights and words of gratitude. We, the Search Central Live team, were extremely busy during the latter half of last year. I participated in some of the events, though not all of them; we held events in various countries, including Hong Kong and Tokyo, and we published summary articles about them. I recall the article was from around the end of the year. As I mentioned on social media at the end of last year, thank you so much for all your support, and I look forward to your continued support. Once again, I'd like to express my gratitude. And this is the last blog post I'll be introducing today: a post about Google Discover and the February 2026 core update. It was published quite a while ago, so I think most of you have probably already read it, but we released the core update for February 2026 in early February.
Segment 2 (05:00 - 10:00)
This core update is a broader, more extensive application of the systems that display articles on Discover, so I think some of you may be interested. By all means, please take a look at the official blog. So, that concludes my introduction to the main blog posts. Okay, then let's move on to the main topic, the first question. We have received a question regarding control of images in text search results. "We support e-commerce sites. When searching by product name, the search results page may sometimes display images of related products instead of the target product as thumbnails. I've prepared a thumbnail image for the target product, specified the image URL of the target product in a meta tag named 'thumbnail,' and wrapped the related items in span elements with data-nosnippet, but related products are still sometimes displayed. I would appreciate any solutions or better methods." Thank you for your question. This is something I get asked about often, but you can't directly control the images shown in text search results. As stated in the official documentation linked here, please refer to Google's image SEO best practices for optimization. Normally the answer would end here, but just the other day, about two days before this recording, a section on using metadata to specify the preferred image was added to that document. You can indicate a preferred image with structured data, such as the primaryImageOfPage property, or with an image meta tag. The document has already been updated in Japanese as well; for more details, please refer to the section on specifying the preferred image using metadata. Thank you for your question. Now let's move on to the next question. We have received a question regarding different statuses being displayed in Search Console reports.
"The URLs submitted via the sitemap are being detected in Search Console and categorized as unindexed. However, when I checked a URL with the URL Inspection tool, it said that Google does not recognize the URL. Is the display in Search Console simply incorrect, or is there an implementation issue, such as the sitemap not actually being recognized, or the sitemap status being wrong?" Thank you for your question as well. I checked with the person in charge, and received the following response: the data source used differs depending on the tool, and the update frequency also differs, so inconsistencies can occur between them, but usually these inconsistencies disappear over time. So I think it would be best to wait a while and then check the report again. Thank you for your question. Now let's move on to the next question. We have received a question regarding which properties to use with the disavow tool. "The websites we operate have long used link disavowal as a measure against spam links. For our current website, which of the following two approaches is correct for uploading the disavow file to Search Console? The first is to upload it only to the property for the current domain. The second is to also upload the file to the property for the previous domain, which has already been redirected to the current domain. Is it possible that spam links pointing to the old domain still exist, and is there a chance that the negative impact of those links could carry over to the current domain that it redirects to?" Thank you for your question as well. To give you a quick answer, please upload the disavow file only to the Search Console property for your current domain, which is the first option in this case. If the redirects are set up correctly, Google will forward the link signals to the new domain.
There is no need to upload it to the old domain's property. Thank you for your question.
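Incidentally, for anyone unfamiliar with it, the disavow file itself is just a UTF-8 plain-text file, one entry per line, uploaded through the disavow links tool. A minimal sketch — the domains and URL below are placeholders, not real sites:

```text
# Spam links identified in a backlink audit (comment lines start with "#")
domain:spam-example-1.com
domain:spam-example-2.net
https://example.org/some/spammy-page.html
```

As discussed in the answer, this file would be uploaded only to the property for the current domain.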
Segment 3 (10:00 - 15:00)
Now let's move on to the next question. We have received a question about an article page, rather than the category top page, being displayed higher in search results. "We operate an e-commerce website. Each product category has its own category top page, where we publish articles that we believe will be beneficial to users. The category top page gathers links to the other pages in one place, and in terms of the keywords we target, we consider it more useful than the article pages. However, in multiple categories, an article page such as a how-to-choose guide, rather than the category top page, is displayed higher in the search results. Could there be some kind of penalty preventing the category top page from ranking highly? Could having too many internal links, or breadcrumb links pointing directly to product categories, have an impact?" Thank you for your question. Yes, this is also one of the frequently asked questions. If, instead of the category top page, a more specific information page such as a guide or a how-to-choose article is displayed, Google may simply have determined that the page containing more detailed information is currently more relevant to the user's search. Next, we have received a question regarding being unable to run a test on a public URL. "I tried to test the public URL to check whether there were any problems, because the pages were not being indexed well according to the URL inspection in Search Console, but an error occurred on a specific page and I was unable to run the test. Additionally, the request for indexing could not be executed. What might be causing the public URL to be unavailable for testing, even though we haven't blocked any access and the page speed isn't significantly slow?" Thank you for your question as well. Because specific website information was included, I was able to check the situation.
It seems the problem has now been resolved, or at least it appeared to have been. Generally speaking, if you encounter errors when testing a public URL, it's highly likely that Google hasn't been able to fully retrieve and render the content. In this case, it was a site where the top page and a specific domain were running on different systems, and in such cases something on the server side hosting that domain's content — for example a firewall, request blocking, or a timeout — may be interfering. Checking the server logs to see whether Googlebot's access is being restricted for such reasons often gives a good prospect of resolving the problem. I hope this is helpful. Thank you for your question. Now, let's move on to the next question, about not all content on a site being indexed. "After all of the approximately 400 pages of our newly released overseas subdomain were indexed, the indexed count gradually decreased, and the pages were eventually removed from the index with the status 'Crawled - currently not indexed.' Sitemap errors and unresolved hreflang issues may be affecting indexing. Since all pages were indexed once, it doesn't seem to be a crawling issue, and the public URL test is also normal, so it doesn't seem to be a rendering issue either. Are there any possible solutions?" I checked with the team in charge, and they said that, generally speaking, it takes time for new content to be indexed. Since the status is "Crawled - currently not indexed," I think it would be good to focus on improving the relevance and quality of the content. Also, please continue working on resolving the sitemap errors and the unconfigured hreflang, and if any problems remain, please let us know.
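As an aside, the server-log check mentioned in the public-URL answer above can be sketched roughly like this — a minimal Python example, assuming access logs in the common combined format (the log lines, IPs, and paths here are made up for illustration):

```python
import re
from collections import Counter

# Sample access-log lines in Combined Log Format (fabricated data for illustration).
LOG_LINES = [
    '66.249.66.1 - - [10/Feb/2026:06:25:01 +0000] "GET /products/123 HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [10/Feb/2026:06:25:07 +0000] "GET /blocked-page HTTP/1.1" 403 220 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.9 - - [10/Feb/2026:06:26:11 +0000] "GET /products/123 HTTP/1.1" 200 5120 "-" "Mozilla/5.0"',
]

def googlebot_status_counts(lines):
    """Count HTTP status codes served to requests whose user agent claims to be Googlebot."""
    # In combined log format the status code follows the closing quote of the request line.
    pattern = re.compile(r'" (\d{3}) ')
    counts = Counter()
    for line in lines:
        if "Googlebot" not in line:
            continue
        m = pattern.search(line)
        if m:
            counts[m.group(1)] += 1
    return counts

print(googlebot_status_counts(LOG_LINES))
```

In real logs, a run of 403s or timeouts served to Googlebot would point to exactly the kind of firewall or request-blocking issue mentioned in the answer.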
Segment 4 (15:00 - 20:00)
Now let's move on to the next question. We have received a question regarding "Crawled - currently not indexed" content. "According to our SEO partner, having too many crawled-but-unindexed pages can be seen as a sign of low quality for the entire site, which can negatively impact its overall ranking. Is it true that having too many crawled-but-unindexed pages can make the entire site appear low quality?" Thank you for your question as well. "Crawled - currently not indexed" means that Google crawled the page but determined it wasn't valuable enough to be displayed in search results. A high number of these statuses means that Google has decided not to index the majority of your content. However, this doesn't mean that the entire site is being subjected to algorithmic intervention as a result. So, in such cases, our answer is that it would be best to keep working on improving the usefulness and quality of individual pages. Thank you for your question. Okay, then, let's move on to the next question. We have received a question regarding duplicate detection on e-commerce category pages. "I run an e-commerce website. Even though the category pages have completely different designs and main introductory texts, they are being flagged as duplicates in Search Console, with Google selecting a different canonical page. We frequently see cases where category pages intended to be indexed are excluded from the index. Is this because there's a high proportion of boilerplate elements such as navigation, headers, and footers? Alternatively, is it possible that instability in server response speed, such as delays in image loading, could cause Google to fail to fully retrieve the content during crawling, resulting in duplicate detection or exclusion from indexing?" This is another question we occasionally get, but in conclusion, I think both of the things the questioner describes are possible.
If duplicate detection occurs frequently on an e-commerce site's category pages, then, as you pointed out, there is a good possibility that a high boilerplate ratio is a factor. In particular, on an e-commerce site, pages that differ only in the product list or the faceted navigation can sometimes be judged as lacking uniqueness. And instability in the server's responses could potentially cause content retrieval itself to fail, and as a result lead to duplicate detection. To emphasize uniqueness, it's effective to ensure that the unique text elements of the main content are loaded early in the HTML. I'm not sure whether this is directly related to the current issue, but we also have official documentation on best practices for e-commerce sites, so please take a look at the document on recommended methods for making e-commerce sites easier to find in Google Search; you might find some useful information there. Thank you for your question. Now let's move on to the next question. We have received a question regarding how Google handles search queries with variations in wording. "The two terms 'men's esthetics' and its abbreviated form yield different results in terms of attracting customers. Generally speaking, there isn't much difference in meaning, but does Google consider them to be different words? Also, is it possible to influence which results are displayed for queries that include area-related terms?" I can't answer questions about specific queries, but generally speaking, Google strives to return the best possible results for users based on its understanding of each query. Next, we have received a question regarding global support for event structured data.
Segment 5 (20:00 - 25:00)
"Regarding Google's event search feature, is it true that there is currently no support for Japan, and will it be added in the future?" Thank you for your interest. As stated in the official documentation, Google's event search feature is not available everywhere: currently Japan and the Japanese language are not supported, and the documentation describes the regions and languages in which it is available. We don't have any specific timelines for particular languages to announce yet. And, I recall this being mentioned as part of the question, but even if the Rich Results Test reports your markup as valid, the rich result will not be displayed in unsupported regions. I think it would be helpful to check the section on available regions and languages in the official documentation. Thank you for your question. Okay, let's move on to the next question. We have received a question regarding URL design. "Regarding URL paths, I used to hear that unencoded strings and complex directory structures negatively impacted SEO. Is that still true today? Would designing URLs to clearly represent resources, like a REST API, be effective? Also, in the case of Japanese text, are URL-encoded and non-URL-encoded pages recognized as separate pages?" Thank you for your question as well. To put it simply, a simple and easy-to-understand path is beneficial for both users and crawlers, so I think URL design that clearly indicates the resource, as in a REST API, does help Google understand URLs and is indirectly useful. Regarding Japanese paths, Google recognizes them as the same page regardless of whether they are URL-encoded or not, but we do recommend using canonical, consistent URLs. Yes, thank you for your question. Now let's move on to the next question. We have received a question regarding understanding the basics of JavaScript SEO.
I assume this person is looking at the documentation in English, but this is a question regarding "Understand JavaScript SEO Basics" on Google Search Central. In the section on the noindex meta tag, it says that when Googlebot encounters noindex it skips rendering, and the question is: what kind of rendering does this refer to? Does it include the initial DOM build? Thank you for your question as well. The noindex tag is detected when Googlebot parses the HTML head section. Therefore, if Google detects this tag, it may skip resource fetches, JavaScript execution, and full rendering. However, the initial DOM build is necessary to read the noindex tag itself, so it is usually performed; that is what the documentation means. So the answer to "does this include the initial DOM build?" is that it is usually done. I hope that answers your question. Okay, thank you for your question. Now let's move on to the next question. We have received a question regarding iframe tags inside the head. "Third-party JS libraries, such as social media (SNS) embeds, may insert tags that would normally be written within the body, such as iframe, into the head section. I recall Google having previously stated that at the point where an iframe is inserted, the head may be considered to have ended, and any subsequent tags within the head may be ignored. However, it appears that the title tag and the link canonical tag placed after the iframe are actually working." In particular, they are
Segment 6 (25:00 - 30:00)
considering that fixing this may be difficult in the case of third-party libraries, and are asking how severe the issue is. Thank you for your question. Well, your question this time is about severity, and I think the priority of any problem depends on your business challenges and situation, so there is little we can say about that assessment. With that in mind, if you have already confirmed with the URL Inspection tool that these important tags are being correctly recognized, then perhaps the severity isn't that high. In this case, it appears that the title tag and the link canonical are in fact being picked up, so the severity is probably not that high, but as always, we recommend correcting the markup for the sake of HTML integrity. Thank you for your question as well. Now let's move on to the next question. We have received a question regarding the behavior of redirects when Googlebot visits. "Is there a way to investigate redirect behavior when Googlebot visits? Due to an incorrect implementation, visits from Google's IP addresses are redirected to a different URL. How can I investigate the behavior of redirects for Googlebot visits in a case like this? If I use Search Console's URL Inspection tool on the redirecting URL, the redirect destination URL is what gets inspected, and the fact that a redirect occurred is not reported. Also, because it is a massive site, there are a huge number of URLs recorded in the page indexing report, and it was impossible to investigate specific URLs." Thank you for your question. To investigate cases where redirects occur only for Googlebot IP addresses, the most reliable method is to check the site's server logs and directly examine the server's responses to the Googlebot user agent.
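When reading those server logs, it also helps to confirm that a log line claiming to be Googlebot really is Googlebot. Google's documented check is a reverse DNS lookup on the IP followed by a forward lookup on the resulting hostname. Here is a minimal sketch, with the DNS lookups passed in as functions so it can be demonstrated offline — in real use you would pass thin wrappers around socket.gethostbyaddr and socket.gethostbyname, and the stub addresses below are illustrative only:

```python
def is_verified_googlebot(ip, reverse_lookup, forward_lookup):
    """Verify an IP as Googlebot: the reverse DNS hostname must end in
    googlebot.com or google.com, and a forward lookup of that hostname
    must resolve back to the same IP.

    reverse_lookup(ip) -> hostname; forward_lookup(hostname) -> ip.
    Both may raise OSError on lookup failure.
    """
    try:
        host = reverse_lookup(ip)
    except OSError:
        return False
    if not (host.endswith(".googlebot.com") or host.endswith(".google.com")):
        return False
    try:
        return forward_lookup(host) == ip
    except OSError:
        return False

# Offline demonstration with stub resolvers (addresses and hostnames are made up).
_REVERSE = {"66.249.66.1": "crawl-66-249-66-1.googlebot.com"}
_FORWARD = {"crawl-66-249-66-1.googlebot.com": "66.249.66.1"}

def rev(ip):
    host = _REVERSE.get(ip)
    if host is None:
        raise OSError("no PTR record")
    return host

def fwd(host):
    addr = _FORWARD.get(host)
    if addr is None:
        raise OSError("no A record")
    return addr

print(is_verified_googlebot("66.249.66.1", rev, fwd))   # True
print(is_verified_googlebot("203.0.113.9", rev, fwd))   # False
```

The reverse-then-forward confirmation mirrors Google's documented verification procedure; only log lines that pass it should be treated as genuine Googlebot traffic when diagnosing IP-based redirects.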
Alternatively, I think it could be effective to check the codebase or database for hardcoded IP addresses, as well as firewall and CDN rules, so it would be good to look into those as well. Thank you for your question. Okay, let's move on to the next question. We have received a question about updated article content not being reflected in the search results. "We made significant updates to an article's title, description, and content. While all our other articles have had their titles and descriptions updated in the search results, this article alone has remained unchanged for about a month. Suspecting spam links, we tried various approaches, such as creating a new URL for the content and implementing a 301 redirect from the old one, but despite making significant changes, there is still no sign of the search results changing." Thank you for your question as well. Since specific queries and websites were included, we were able to verify the situation. You said that, to solve the problem, you created content at a new URL and set up a redirect from the previous URL. But when I checked the search results in my environment, it seemed that this redirect had already been removed, and instead it appeared that you had created yet another new URL for the content. And that latest URL was displayed near the top of the search results, so does this mean the problem has been solved? In any case, if 301 redirects are not implemented correctly, problems often arise. This is just a general comment, but please check the official documentation
Segment 7 (30:00 - 35:00)
. There you can find documentation regarding redirects and Google Search. Okay, then, let's move on to the next question. Sorry, let me take a quick drink; I've been talking non-stop the whole time. Right, on to the next question. We have received a question regarding reflecting an article's latest update date in the search results. "With the aim of improving CTR, I'd like the latest update date of an article to be reflected in the search results. Currently, neither the headings nor the meta tags list an update date. What should I do?" Thank you for your question as well. I understand the question to be about how to have Google reflect a page's updated or published date in the search results. As for how to set it up, please refer to the official documentation, specifically the document on providing dates to Google Search; I think you'll understand how to mark it up there. Please do give it a read. Thank you for your question. Now, let's move on to the next question. We have received a question regarding the Search Console bulk data export. "Regarding the bulk export data in Search Console, I would like to calculate the number of impressions and average position for queries where links to our site appear on multiple pages of the search results. For example, suppose a user searches for 'aa,' the first result on the first page is a URL from our site aa.com, and the 15th result, on the second page, is another URL from aa.com. If the user performs a search that displays results up to the second page, what would the results be if we used the site-impression search data to calculate the number of impressions and average ranking for aa.com?" Thank you for your question as well. I checked with the person in charge regarding this matter.
As a premise, the bulk export includes two tables: searchdata_site_impression, which aggregates search data by property, and searchdata_url_impression, which aggregates by URL. So if two of your links appear for one search, the site table will record 1 impression, while the URL table will record 2 impressions. Regarding how the average ranking is calculated, please refer to the guidelines and reference documentation linked here for details. In this case, since you are referring to the data aggregated by property in searchdata_site_impression, the number of impressions for the "aa" query is 1, and the average ranking is 1, as you understood. Yes, thank you for your question. Now let's move on to the next question, which will be our last for today. We have received a question regarding time zones in Search Console. "The documentation clearly states that search performance is displayed in California local time. Are the page indexing report, crawl stats, and other dashboards also displayed in California local time?" Thank you for your question as well. As stated in the official documentation: if you select the 24-hour view, the data is displayed in your local time zone, and the local time zone is based on your browser settings. For the other options, all dates are displayed in Pacific Time (PT); this is noted in the documentation, so please refer to it. For more details, see the section on time zones in the performance report in the official documentation; you can find more detailed information there. Well, today's broadcast is the first in about three months
Segment 8 (35:00 - 37:00)
or maybe a little over two months, so we've received a lot of questions. I think I may have been speaking a little too fast in order to get through all of them, so if there's anything you're curious about, I recommend listening again a little slower, at 0.75x or 0.5x speed. Excuse me. So, that's all for today. Did everyone enjoy this Google Search Office Hours? As I mentioned earlier, we covered a lot of questions this time. The next Google Search Office Hours is scheduled for mid-June 2026. Some of you may be wondering about this, but this year it will be difficult to hold office hours on the usual monthly basis, and I apologize for any inconvenience this may cause. I'm very happy that you're taking advantage of these office hours and watching them, but if you run into problems, I think it's also a good idea to check the official documentation again and to use the Search Central community and other official channels as well. As I mentioned in response to the first question, sections are sometimes added to the official documentation without much fanfare, so I think your understanding will deepen if you check it regularly. Please do check it out. If you have any problems or concerns, please feel free to submit your questions through the form; on the homepage you'll find a list of past videos and a link to the Q&A form. Okay then, I look forward to seeing you at the next office hours. Please do watch again sometime. Bye!