A sitemap is a file hosted alongside the other files on your website that gives search engines crawling your site information about its content. The Sitemap protocol defines an XML structure that lists your website's URLs and, optionally, three other pieces of information about each one.
Besides allowing for easier navigation and better visibility by search engines, sitemaps also let you inform search engines about changes to your site faster. Sitemaps are especially useful for new websites and blogs, since those usually don't get many backlinks to their individual pages or posts; a sitemap helps search engines crawl such sites more efficiently and discover all of their pages. For popular websites, a sitemap lets your site "talk" to search engines more efficiently: it helps the search engine "guess" how often your site is updated, which parts of it are updated more frequently than others, how to set the crawl rate for your site, and so on.
An example sitemap with only one URL would look like this:
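The following is a minimal sketch following the Sitemap protocol (example.com is a placeholder URL; the dates and values are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```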
Besides the required <loc> tag, you may specify three optional tags:
<lastmod> (the date of last modification, in W3C Datetime format),
<changefreq> (for example, monthly) and
<priority> (the URL's priority relative to other pages on your site).
Make sure your URLs never point to non-existent pages (ones that return a 404 error code, for example), since that undermines the credibility of the sitemap. That is part of the reason why it's best to use an automatic sitemap generator that regularly tracks changes to your website.
The first step is to make sure your sitemap is up to date and contains all the URLs you want (and none you don't). The main thing is that none of them should return a 404; beyond that, they should all return 200s. Unless you're dealing with a gigantic site that is hard to maintain, there shouldn't be any errors in a sitemap as long as the URLs in it are correct. Getting sitemaps right on a large site can make a huge difference to the crawl rate, with a corresponding boost in indexation to follow.
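One way to audit a sitemap for dead links is a small script that extracts every <loc> value and fetches each URL to confirm it returns 200. The sketch below uses only the Python standard library; the sample sitemap and its example.com URLs are hypothetical placeholders, so swap in your own sitemap before running the status checks:

```python
import urllib.error
import urllib.request
import xml.etree.ElementTree as ET

# Namespace defined by the Sitemap protocol (sitemaps.org).
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"


def extract_urls(sitemap_xml: str) -> list[str]:
    """Return every <loc> value found in a sitemap document."""
    root = ET.fromstring(sitemap_xml)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]


def check_status(url: str) -> int:
    """Fetch a URL with a HEAD request and return its HTTP status code."""
    request = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(request) as response:
            return response.status
    except urllib.error.HTTPError as err:
        return err.code  # e.g. 404 for a dead page


# Hypothetical sitemap used for illustration only.
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/</loc></url>
  <url><loc>https://www.example.com/about</loc></url>
</urlset>"""

for url in extract_urls(sample):
    print(url)
    # To audit a live site, uncomment the check below and flag
    # anything that does not come back as 200:
    # print(url, check_status(url))
```

The network call is left commented out so the extraction step can be tested offline; on a real site you would run the check on a schedule and drop any URL that stops returning 200.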