The [robots.txt|http://www.robotstxt.org/] file plays an important role in SEO. It tells search engines which URLs can be crawled and which cannot.
Serving robots.txt from a decoupled stack is challenging: the Drupal backend and the NodeJS frontend are separate systems, and altering the API just to serve this data is expensive.

Enable the RobotsTxt module in Drupal

Enable and configure the [RobotsTxt|https://www.drupal.org/project/robotstxt] module on the Drupal side. [The configuration for this module|https://www.drupal.org/files/project-images/robotstxt_configuration.png] is pretty straightforward.

Add a new route in the NodeJS server

Create a new route in your index.js. This is the same file where the Express module is included and the server-related configuration is specified.
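For context, a minimal index.js skeleton might look roughly like this (a sketch; the port and the exact layout are assumptions based on a typical Express setup):

// index.js - minimal Express setup (assumed layout)
const express = require('express');
const request = require('request'); // used below to proxy files from Drupal

const app = express();

// ... the robots.txt route from the next snippet goes here ...

app.listen(3000, function () {
    console.log('Server listening on port 3000');
});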

Here is a quick and neat code snippet.

// The request module streams the file from Drupal; require it at the top of index.js
const request = require('request');

// Specify the Drupal side robots.txt location (include the protocol, or request will reject the URI)
let robotsTxt = `https://your_domain_name_com/robots.txt`;

app.get('/robots.txt', function (req, res) {
    // Pipe the incoming request to Drupal, then pipe Drupal's response back to the client
    req.pipe(request(robotsTxt)).pipe(res);
});

In the above example, we used the pipe method. Piping lets you read data from the source URL and write it to the destination without managing the data flow (buffering and backpressure) yourself.
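One caveat worth noting: if the Drupal backend is unreachable, the stream returned by request emits an error event, and an unhandled error will crash the Node process. Here is a minimal guarded sketch (the 502 status on failure is an assumption, not part of the original setup):

app.get('/robots.txt', function (req, res) {
    req.pipe(request(robotsTxt))
        .on('error', function (err) {
            // Fail gracefully if the Drupal backend cannot be reached
            if (!res.headersSent) {
                res.status(502).end();
            } else {
                res.end();
            }
        })
        .pipe(res);
});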

A similar approach can be used to serve the sitemap.xml file from the headless Drupal 8 server through the NodeJS server.
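As a sketch, the sitemap route mirrors the robots.txt route above (this assumes Drupal exposes the sitemap at /sitemap.xml, which depends on how sitemap generation is configured on the Drupal side):

// Specify the Drupal side sitemap.xml location (assumed path)
let sitemapXml = `https://your_domain_name_com/sitemap.xml`;

app.get('/sitemap.xml', function (req, res) {
    // Same piping pattern as the robots.txt route
    req.pipe(request(sitemapXml)).pipe(res);
});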