
Setting up federated search in your help center using a search crawler or the API




Elizabeth Williams

Zendesk Documentation Team

Edited Feb 10, 2025



22 comments

well-documented content 👍



Hello, do you know if the search crawler can be used to index content in an Atlassian Confluence site? Many thanks!




Jeff C

Zendesk Customer Care

Hello Julien,

Yes! The crawler can be used to index content from your Confluence site. If you are running into issues setting it up, please reach out to us directly for support.
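Besides the crawler, the article title also mentions the API route. As a rough sketch of that path (the endpoint, field names such as source_id, type_id, and user_segment_id, and the auth scheme here are assumptions drawn from Zendesk's Federated Search API docs; verify against the current API reference, and note that source and type records must be created first):

```python
import base64
import json
import urllib.request

def build_external_record_request(subdomain, email, api_token, record):
    """Build (but do not send) a POST request that pushes one external
    content record to the Federated Search API."""
    # Endpoint path is an assumption to verify against the API reference.
    url = f"https://{subdomain}.zendesk.com/api/v2/guide/external_content/records"
    payload = json.dumps({"record": record}).encode("utf-8")
    req = urllib.request.Request(url, data=payload, method="POST")
    req.add_header("Content-Type", "application/json")
    # With an API token, basic auth uses "email/token" as the username.
    creds = base64.b64encode(f"{email}/token:{api_token}".encode()).decode()
    req.add_header("Authorization", f"Basic {creds}")
    return req

# All values below are placeholders for illustration.
record = {
    "url": "https://example.com/docs/page",
    "title": "Example page",
    "body": "Searchable text of the page.",
    "locale": "en-us",
    "source_id": "your-source-id",
    "type_id": "your-type-id",
    "user_segment_id": None,  # None = visible to everyone
}
req = build_external_record_request("yoursubdomain", "me@example.com", "token", record)
# urllib.request.urlopen(req) would actually send it.
```

The API route is useful when content sits behind authentication or has no sitemap, which the crawler cannot handle.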



Hey Julien SERVILLAT, did you manage to get this set up and working?
@... we were told this wasn't possible, so I'm getting conflicting information...



Hello Matt Farrington-Smith, yes, we managed to set up the crawler with Confluence. Indexing of the articles works; we are just finalizing the theme upgrade to include the correct placeholders for returning federated search results.
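For anyone else at the theme-placeholder step: in a theme on a recent Templating API version, external results come back in the same results collection as articles on the search results template, so a minimal loop in search_results.hbs might look like the sketch below (the property names url, title, and text are assumptions to check against the Guide templating reference for search results):

```hbs
{{#each results}}
  <li class="search-result">
    <a href="{{url}}" target="_blank">{{title}}</a>
    {{#if text}}<p class="search-result-snippet">{{text}}</p>{{/if}}
  </li>
{{/each}}
```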



Hello, after a search, is the crawler's content presented embedded, or is it just a link that opens the external content in a new window or tab?

I ask because I am wondering whether it makes sense to host HTML content in an S3 bucket without a custom domain. If the crawler embeds the content, not having a custom domain doesn't matter; it's a different story if it opens the content externally.

I suspect not using a custom domain may create problems, like with the domain-verification rule.

I hope my question makes sense!



Hi Cesar, and welcome to the community! 
 
When users perform a search, relevant external content discovered by the crawler is ranked and presented on the search results page, where users can filter the results and click the links to view the external content in another browser tab.
 
For more information, see About Zendesk Federated Search.



Hi,

Do we have a guide on deploying search crawlers on MS SharePoint?



Hello, is the crawled external content's visibility set to Everyone? Is there a way to control the visibility settings of the crawled external content?
We'd like to limit the crawled external content to agents and admins for one of our use cases, but based on what's available in the settings, I assume that it's not possible.



I'm having difficulties getting external content to show up for everyone in search results. I have enabled it, and I have verified that the crawler is working. Am I missing something?



When configuring the crawler, if you designate a site while including a subdomain, will the crawler cover the entire domain or just the subdomain?

Is the crawler's verification tag persistent across crawlers or is a new one generated for each crawler? For instance, if I start configuring a crawler, copy the tag, but don't save the crawler because I don't have the sitemap URL, will that tag be the same when I go back to finish creating the crawler?




Dane

Zendesk Engineering

Hi Sarah,

This is visible to Everyone, and it is not possible to restrict it the way user segments restrict Help Center articles.
 
Hi mfg,
 
Every domain that you designate for the crawler will have a different verification tag. The same goes if you create another crawler for a domain that has already been verified.



Has anyone attempted this with MadCap Flare? We're testing currently and we're unable to verify the domain in our POC for a <companyname>.mcoutput.com domain. I'm guessing it is because we don't own the MadCap domain of mcoutput.com, am I right?



We are using an ecommerce platform that has its sitemap at storename/xmlsitemap.php.

It looks like this requires the sitemap to be .xml? Google's crawler has no issue with our sitemap. What can I do?
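One possible workaround, if the crawler really does require a .xml URL: periodically snapshot the PHP-generated sitemap to a static file and serve that copy from a *.xml path. The sitemap protocol itself doesn't care about the extension, only the XML content. A stdlib-only sketch (source URL and destination path are placeholders):

```python
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def extract_urls(sitemap_xml):
    """Return the <loc> URLs from standard sitemap XML, regardless of
    what file extension the XML was served under."""
    root = ET.fromstring(sitemap_xml)
    return [loc.text for loc in root.iter(SITEMAP_NS + "loc")]

def snapshot_sitemap(src_url, dest_path):
    """Fetch the PHP-generated sitemap and save a static copy that can
    be served under a *.xml path (hypothetical workaround)."""
    with urllib.request.urlopen(src_url) as resp:
        data = resp.read()
    ET.fromstring(data)  # fail early if the response isn't valid XML
    with open(dest_path, "wb") as f:
        f.write(data)
```

For example, a scheduled job could run `snapshot_sitemap("https://storename.example/xmlsitemap.php", "sitemap.xml")` and upload the result wherever the crawler is pointed.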



Hi.

I am trying to set up a search crawler for our help desk. I have embedded the meta tag in the header of the target webpage, and the URLs for the site and the sitemap are both correct, but it still gives me a warning saying that the domain cannot be verified and that it cannot find the sitemap. What can I do?
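One quick way to rule out a common failure mode is to confirm that the verification meta tag actually appears in the HTML your server sends (proxies, caches, or build pipelines sometimes strip it). The sketch below uses only the stdlib; the tag name "zd-site-verification" is an assumption, so copy the exact name and token from your own crawler setup screen:

```python
from html.parser import HTMLParser

class MetaTagFinder(HTMLParser):
    """Collect <meta name="..." content="..."> tags from a page."""
    def __init__(self):
        super().__init__()
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = dict(attrs)
            if "name" in a:
                self.meta[a["name"]] = a.get("content", "")

def has_verification_tag(page_html, expected_token,
                         tag_name="zd-site-verification"):
    # tag_name is an assumption; check it against your crawler settings.
    finder = MetaTagFinder()
    finder.feed(page_html)
    return finder.meta.get(tag_name) == expected_token
```

Fetch the page the same way the crawler would (plain HTTP GET, no cookies) and run the served HTML through this check; a mismatch there, rather than in your source templates, often explains a failed verification.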



Hey Elizabeth Williams,

I want to initiate re-indexing URLs for Google. Will this help me with that too?




Destiny

Zendesk Customer Care

Hello Rajan,
 
Thank you for getting in touch.
 
It seems you're interested in whether reindexing URLs with Google will enhance the visibility of your pages in search results. Have I understood that correctly?
 
It's important to note that the process of reindexing URLs through Google Search Console is separate from setting up Zendesk's search crawler. They serve distinct functions. Could you clarify your specific requirements or the goal you're aiming to achieve? This will enable me to offer more tailored advice.



Hello. We have a load problem on our own sites that we index with Zendesk crawlers, and it's apparently not possible to customize the frequency or speed of the crawlers. Is a change possible or planned?




Shawna James

Community Product Feedback Specialist

Hey Anne-Flore, thank you so much for taking the time to provide us with your feedback here. I wanted to point you in the direction of our product feedback forum for Guide, where we collect product feedback and where our PMs review feedback from our customers. We would greatly appreciate you using this forum to share your feedback with us for better visibility. Here is our product feedback template to get you started. Thank you!




Sebastiaan Cools

Zendesk Luminary

Is it possible to have the crawler search a SharePoint site with authentication?



Unfortunately, it is not possible to use script-generated XMLs like /zendesk/sitemap.xml?path=/topic/*&template=tutorial-article. Can you say why there are restrictions?




Destiny

Zendesk Customer Care

Hi Robert,
 
Thank you for your inquiry regarding the use of script-generated XMLs, such as /zendesk/sitemap.xml?path=/topic/*&template=tutorial-article. I understand that these restrictions can be frustrating, and I’d like to provide some insight into why they are in place.
 
  1. Security Concerns: Allowing unrestricted access to certain paths or templates may expose sensitive information or internal structures, which could be exploited. These restrictions help maintain the integrity and security of our application.
     
  2. Performance Issues: Generating XML files dynamically can be resource-intensive, especially with a large number of articles or topics. To ensure optimal performance, the system must manage requests efficiently, and allowing dynamic parameters could complicate this process.
     
  3. API Limitations: Our API is designed to prioritize reliability and simplicity. Complex queries can introduce unpredictability, so certain features or endpoints may be restricted to maintain stability.
     
  4. Content Management Policies: We enforce specific content structures and templates to ensure consistency across our platform. This helps maintain quality and relevance, particularly for SEO purposes, as sitemaps are critical for indexing by search engines.
     
  5. Technical Constraints: Supporting dynamic paths may require significant changes to our architecture, which could introduce bugs or necessitate extensive testing. Additionally, the generation of XML files may depend on systems that are not designed to handle such complexity.
     
I hope this explanation clarifies the reasons behind the restrictions on using dynamically generated XMLs. If you have any further questions or need assistance with alternative solutions, please feel free to reach out.


