Many SEOs try various methods to compel Googlebot to re-crawl specific URLs on demand, such as using free pinging services or building links to their websites, but forget that Google already offers an excellent re-crawling tool in its own toolbox.
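For reference, the "pinging" approach mentioned above usually amounts to nothing more than telling Google where your sitemap lives and hoping a crawl follows. Below is a minimal sketch of that idea, assuming a hypothetical sitemap URL and the classic Google sitemap ping endpoint as it existed at the time of writing; it does not guarantee a re-crawl of any particular page.

```python
# Minimal sitemap "ping" sketch. SITEMAP_URL is a hypothetical example;
# the ping endpoint simply accepts the sitemap location via a GET request.
from urllib.parse import quote
from urllib.request import urlopen

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # hypothetical sitemap location


def ping_google(sitemap_url: str) -> int:
    """Notify Google of a sitemap and return the HTTP status code of the ping."""
    endpoint = "https://www.google.com/ping?sitemap=" + quote(sitemap_url, safe="")
    with urlopen(endpoint) as response:
        return response.status


if __name__ == "__main__":
    print("Ping returned HTTP", ping_google(SITEMAP_URL))
```

As the rest of this article explains, this tells Google only that a sitemap exists; it does not let you demand a re-crawl of one specific, freshly repaired URL.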
If you want a specific URL to be re-crawled after making important changes or recovering from a malware attack, and you do not want to wait for the next scheduled crawl, you can use the updated Fetch as Googlebot feature in Webmaster Tools, which includes the re-crawl option labeled Submit URL to Index.
Once you use the Submit URL to Index feature, indexing usually follows quickly. If you have recently added or repaired a URL, visit the Fetch as Googlebot section and instruct Google to fetch the desired URL. Once the fetch completes successfully, use Submit URL to Index to prompt Google to re-crawl the URL.
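Before submitting, it is worth confirming that the repaired page actually responds with HTTP 200 and does not block indexing. The sketch below is one way to run that sanity check with the standard library only; the page URL is hypothetical and nothing here calls any Google service, it simply inspects your own page the way a crawler would.

```python
# Pre-submission sanity check: the page should return 200 and should not
# carry a noindex directive in either a robots meta tag or the X-Robots-Tag
# header. The meta-tag check is a simple heuristic, not a full HTML parse.
import re
from urllib.request import Request, urlopen

PAGE_URL = "https://www.example.com/repaired-page.html"  # hypothetical page


def ready_for_resubmission(url: str) -> bool:
    request = Request(url, headers={"User-Agent": "recrawl-check/1.0"})
    with urlopen(request) as response:
        if response.status != 200:
            return False
        x_robots = response.headers.get("X-Robots-Tag", "") or ""
        html = response.read().decode("utf-8", errors="replace")
    noindex_meta = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]*noindex', html, re.I
    )
    return noindex_meta is None and "noindex" not in x_robots.lower()


if __name__ == "__main__":
    print("Safe to submit:", ready_for_resubmission(PAGE_URL))
```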
Remember to choose the URL and all linked pages option so that the internal links within the page are re-crawled as well. It also helps if the pages you submit contain plenty of internal links, which encourages the robot to perform a more thorough re-crawl.
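If you want a rough idea of how much internal linking the robot will find on a submitted page, you can list the same-host links it exposes. This is a small standard-library sketch under the same assumption of a hypothetical page URL; it is not part of Webmaster Tools, just a quick way to audit your own page before submitting it.

```python
# List the internal (same-host) links on a page so you can judge how much
# the "URL and all linked pages" option will actually have to follow.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

PAGE_URL = "https://www.example.com/repaired-page.html"  # hypothetical page


class LinkCollector(HTMLParser):
    """Collect raw href values from anchor tags."""

    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)


def internal_links(page_url: str) -> list:
    with urlopen(page_url) as response:
        html = response.read().decode("utf-8", errors="replace")
    collector = LinkCollector()
    collector.feed(html)
    host = urlparse(page_url).netloc
    links = {urljoin(page_url, href) for href in collector.hrefs}
    return sorted(link for link in links if urlparse(link).netloc == host)


if __name__ == "__main__":
    for link in internal_links(PAGE_URL):
        print(link)
```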
The advantage of on-demand re-crawls is that you can spot errors almost immediately and fix them quickly, instead of waiting days or weeks for a scheduled crawl. Your clients do not have to wait long to have their issues resolved, which in turn boosts your reputation.
The only hitch is that index submissions are limited to 10 per month per account, so managing several client websites under a single account could pose a problem. Even so, this handy tool from Google is an excellent way to get a website back on its feet and indexed again by getting Googlebot to act on your command.