# Rules for Google's crawler
User-agent: Googlebot
Allow: /          # Allow Googlebot to crawl all content by default
Disallow: /api/   # Specifically disallow crawling of API endpoints for Googlebot

# General rules for all other crawlers
User-agent: *
Disallow: /api/   # Disallow crawling of API endpoints for all other bots

# You can add more restrictive rules for other bots here if needed, for example:
# Disallow: /private-directory/

Sitemap: https://stanfordinternational.edu.np/sitemap.xml