
When you have an Odoo website and want to connect Google tools such as "Shopping feed optimization" or Google Ads, Google may need to crawl all of the website's pages. With the default robots.txt configuration in Odoo, the crawl can fail. This happened on v13; it may be fixed in later versions.


Here is how to solve this problem.

We have to modify the current robots.txt file directly in Odoo.

To do so, activate the developer mode.

Then, from the general settings, go to Technical, then "Views" under the "User interface" section.

Search for the view named "robots" and edit its architecture.

By default, it has the following content:

User-agent: *

We are going to erase this line and add the following ones:

User-agent: Googlebot

Disallow:

User-agent: Googlebot-Image

Disallow:

Save the changes. An empty Disallow directive allows everything for the crawlers listed above, so Google should now be able to access all of your website's pages.
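If you want to double-check the new rules before (or after) deploying them, you can parse them with Python's standard-library robots.txt parser. This is just a verification sketch; the `https://example.com` URLs are placeholders for your own domain.

```python
# Sketch: verify that the updated robots.txt rules allow Google's crawlers.
from urllib.robotparser import RobotFileParser

# The directives added to the "robots" view in Odoo.
robots_txt = """User-agent: Googlebot
Disallow:

User-agent: Googlebot-Image
Disallow:
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# An empty Disallow means nothing is blocked, so both crawlers
# may fetch any page (example.com stands in for your domain).
print(parser.can_fetch("Googlebot", "https://example.com/shop"))
print(parser.can_fetch("Googlebot-Image", "https://example.com/web/image/1"))
```

On a live site you can instead call `parser.set_url("https://yourdomain.com/robots.txt")` followed by `parser.read()` to check the file Odoo actually serves.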

