What is the point of using a robots.txt file on GitHub Pages?

According to this Stack Overflow answer, it is possible to use a robots.txt file to prevent search engines from indexing pages that the webmaster does not want indexed.

How is this possible?

Apparently, through a custom domain.
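For instance, a minimal sketch of what I have in mind (the /private/ path is just a hypothetical example), placed as robots.txt at the root of the published site:

```
# Hypothetical robots.txt at the root of the GitHub Pages site,
# asking all crawlers not to index an example "private" area.
User-agent: *
Disallow: /private/
```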

However, if the purpose of robots.txt is, for example, to mark off a private area of the site, what is the point of trying to "hide" any content with it when everything can be viewed freely in the repository on GitHub anyway?

For content to be effectively hidden, would it be necessary to pay for GitHub and use the private repositories feature?

Author: Jonathas B. C., 2017-11-11