A new update approach at Google called Everflux, replacing Update Fritz

Update Fritz was the beginning of the transition from a monthly update to an incrementally updated index. It drew a lot of comments, because plenty of people were happy with an index that changed only once a month. But since evaluating pages only once a month was not quite good enough, the change was rolled out over a summer, when search engine traffic is seasonally low. The resulting update, called Everflux, is expected to bring a certain amount of change to the index every day or so.

Note: sites shouldn't use "&id=" as a URL parameter if they want maximal Googlebot crawling. Because so many sites use "&id=" for session IDs, Googlebot usually avoids URLs with that parameter. So check whether your site puts session IDs in "&id=" and remove that usage for maximum crawl coverage.
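To make that concrete, here is a minimal Python sketch (not an official Google tool; the "id" parameter name comes from the note above, everything else is illustrative) that strips a session-ID-style parameter from URLs before they are published as links:

    from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

    def strip_session_id(url, param="id"):
        # Drop the session-ID query parameter so spiders see one
        # stable URL instead of a new URL for every session.
        parts = urlsplit(url)
        pairs = [(k, v) for k, v in parse_qsl(parts.query) if k != param]
        return urlunsplit(parts._replace(query=urlencode(pairs)))

    print(strip_session_id("http://www.example.com/page?cat=2&id=abc123"))
    # -> http://www.example.com/page?cat=2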

Do backlink and PageRank updates happen at the same time?

Are backlinks and PageRank also updated at the same time? In general, don't fixate too much on backlinks or PageRank during an update. The external backlinks/PageRank that Google shows have at least a couple of factors that complicate analysis. First, they are a snapshot in time, and the actual backlinks/PageRank used for ranking are from a different time interval. Another complicating factor is that Google doesn't show every backlink; it shows only a subset of the backlinks it has internally.

Issues with iframes

Does Google penalize sites that use iframes?

Google does not penalize sites for using iframes. Plenty of legitimate sites use iframes, so it wouldn't make sense to penalize for it. Some search engine spiders have trouble with iframes, just as some spiders have trouble with frames, but iframes aren't expected to cause any penalties.

Strategies for improving link exposure to search engines

Try to use absolute links instead of relative links, because there's less chance for a spider (not just Google's, but any spider) to get confused. In the same fashion, be consistent with your internal linking. Once you've picked a root page and decided on www vs. non-www, make sure that all your links follow the same convention and point to the root page you picked. Also, use a 301 redirect or rewrite so that your root page doesn't appear twice. For example, if you select http://www.yourdomain.com/ as your root page, then when a spider tries to fetch http://yourdomain.com/ (without the www), your web server should do a permanent (301) redirect to your root page at http://www.yourdomain.com/.
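As a rough sketch of that redirect (assuming www.yourdomain.com is the chosen root page; in practice you would more likely configure this in your web server), a minimal Python WSGI app could look like this:

    def app(environ, start_response):
        # Send the non-www host to the chosen www root page with a
        # permanent (301) redirect so spiders index only one version.
        host = environ.get("HTTP_HOST", "")
        path = environ.get("PATH_INFO", "/")
        if host == "yourdomain.com":
            start_response("301 Moved Permanently",
                           [("Location", "http://www.yourdomain.com" + path)])
            return [b""]
        start_response("200 OK", [("Content-Type", "text/html")])
        return [b"<html>...</html>"]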
So the high-order bits to bear in mind are:
- Make it as easy as possible for search engines and spiders; save them calculation by giving absolute instead of relative links (see the sketch after this list).
- Be consistent. Make a decision on www vs. non-www and follow the same convention consistently for all the links on your site.
- Use permanent redirects to keep spiders fetching the correct page.
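To illustrate the first point, here is a small Python sketch (with hypothetical URLs) showing how the same relative link resolves to different URLs depending on which version of a page a spider happened to fetch, while an absolute link leaves no ambiguity:

    from urllib.parse import urljoin

    # The same relative link yields two different URLs, one per host
    # variant, which spiders may treat as duplicate pages.
    print(urljoin("http://www.yourdomain.com/dir/", "page.html"))
    # -> http://www.yourdomain.com/dir/page.html
    print(urljoin("http://yourdomain.com/dir/", "page.html"))
    # -> http://yourdomain.com/dir/page.html

    # An absolute link resolves to itself no matter where it appears.
    print(urljoin("http://yourdomain.com/dir/",
                  "http://www.yourdomain.com/dir/page.html"))
    # -> http://www.yourdomain.com/dir/page.html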

Those rules of thumb will serve you well with every search engine, not just with Google. Of course, the vast majority of the time a search engine will handle a situation correctly, but anything you can do to reduce the chance of a problem is a good idea. If you don't see any problems with your existing site, there's no need to go back and rewrite links. But it's something worth bearing in mind when you build new sites, for example.