⟩ Is there any way to prevent a spider from grabbing URLs that you want to keep off search engines?
Absolutely - there are several ways, and you should use them together. For a quick overview, search for robots or spiders on Google, visit The Web Robots Page, or see B.4 Notes on helping search engines index your website from the World Wide Web Consortium. The most fool-proof method to block spiders is to password protect any files that you don't want indexed by the search engines. See Can a search engine index pages that are password protected?
In general, you should create a robots.txt file for the root folder on your site, use the robots meta tag on pages you don't want indexed, and password protect any files you're serious about protecting.
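As a rough sketch, a robots.txt file placed in your site's root folder might look like the following (the directory names here are just placeholders for whatever you want excluded):

User-agent: *
Disallow: /private/
Disallow: /drafts/

Keep in mind that robots.txt is purely advisory: well-behaved spiders honor it, but it does not actually block access, which is why password protection remains the only reliable safeguard.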
Here is an example robots meta tag that tells compliant spiders not to index the page or follow its links:
<meta name="robots" content="noindex, nofollow">