Bookmark this article, print it, or save it to your computer... you may need to pull it out the next time your SEO Manager and Web Team are about to face off over the direction of your website. I've witnessed (and been a part of) many tussles between these two groups, each typically approaching web projects with different objectives and perspectives in mind.
How a site is coded can determine how well a search engine spiders it, indexes it, and subsequently ranks it. In the early days, bots (or spiders) could only crawl and capture front-end text and textual hyperlinks. You could forget about showing up in the SERPs if your site relied on elements such as Flash, client-side scripting, or dynamic database strings/references. Things have changed dramatically since then. For example, Google announced circa 2008 that it could spider Flash (SWF) files and navigation items. (Although I still wouldn't bet my kids' education funds on Google being able to successfully crawl, index, and display a large Flash-based site appropriately.)
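To make that "front-end text and textual hyperlinks" limitation concrete, here's a minimal sketch of what an early-style bot effectively did, using Python's standard-library `HTMLParser`. The sample page markup (and the `renderFlashNav` function name in it) is invented for illustration: the parser picks up plain text and `<a href>` links, but anything inside `<script>` blocks or a Flash `<object>` is invisible to it.

```python
from html.parser import HTMLParser

class TextLinkSpider(HTMLParser):
    """Sketch of an early-style crawler: it only captures visible
    text and textual <a href> hyperlinks. Content rendered by Flash
    or client-side scripting is invisible to it."""

    def __init__(self):
        super().__init__()
        self.text = []   # indexable text fragments
        self.links = []  # hyperlinks the bot can follow
        self._skip = 0   # depth inside <script>/<style> blocks

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1
        elif tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        # Ignore script/style bodies and whitespace-only runs.
        if not self._skip and data.strip():
            self.text.append(data.strip())

# Hypothetical page: one real text link plus Flash/script navigation.
page = """
<html><body>
  <h1>Widgets</h1>
  <a href="/catalog.html">Catalog</a>
  <script>renderFlashNav();</script>
  <object data="nav.swf"></object>
</body></html>
"""

spider = TextLinkSpider()
spider.feed(page)
print(spider.text)   # the text an early bot could index
print(spider.links)  # the links it could crawl
```

Run against that page, the spider sees only "Widgets" and the "/catalog.html" link; the scripted and SWF navigation never reaches the index, which is exactly why sites built that way didn't rank.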
The following featured article from SEOmoz further explains how bots have evolved, and also provides insight into what this all means for SEO...