It just occurred to me that I only write posts about web site development issues when something bothers me. Like the DNS problems with my ISP in July this year.
It might not only be me who has noticed that many “Social Media Rich Web Sites” and blogs load very slowly. Even in a modern browser on a new computer.
The first guess is:
“It's all the widgets and external components they load.”
But it is not only how much you load. It is also about in what order you load it, and from what servers. The richness of Web 2.0 doesn't necessarily make it easier to be a great webmaster. It's true that widgets can be dropped into a site and add substantial interactivity in a matter of minutes, but optimizing a site for performance hasn't become much easier. There are a lot of different elements to deal with:

various CSS style sheets
affiliate banners or affiliate data feeds
RSS news feeds
embedded video and audio content
Digg, Reddit, and other social media votes
Twitter, Facebook, and Disqus widgets
user avatars, e.g. from myblogcatalog.com or gravatar.com

The list is really endless.

The browser has to request each of these files individually. Some files are cached in your browser and will be downloaded only once (if your web server is set up correctly). In other cases your web server might have to wait for data from third-party web sites before it can finish building a dynamic web page and deliver it to the browser.
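Getting the "downloaded only once" case right usually means sending explicit cache headers for static assets. A minimal sketch, assuming an Apache server with mod_expires enabled (the one-month lifetime is an arbitrary example, not a recommendation):

```apache
# Tell browsers they may reuse static assets for a month
# without re-requesting them on every page view
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType text/css               "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"
  ExpiresByType image/png              "access plus 1 month"
  ExpiresByType image/jpeg             "access plus 1 month"
</IfModule>
```

With headers like these, repeat visitors only download your style sheets, scripts, and images once; just remember to rename a file when you change it, or visitors may keep the stale cached copy.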
And last but not least, the HTML part of the page itself (this one single file) can already be pretty big on its own. Or don't you have 20+ comments on your average blog post?
And don't forget: if you are running a content management system like a blog, all this content is created on the fly from a database. (Or are you using WP Super Cache already, or whatever it is called.)
You will also notice that the longer you have been working on your website, the slower it usually gets.
I have not written about Web development for quite a while. After reading an article about the use of the X-Robots-Tag today, I thought it's time again.
You retrieve content from the Web by typing a Web address into your browser, and the addressed Web server sends you the requested resource: a regular HTML Web page, a PDF document, a JPEG image, a video or Flash movie, an XML file, etc.
The browser and Web server communicate using HTTP (the Hypertext Transfer Protocol), and before the requested data is actually sent, they exchange HTTP request and response headers with information about the document, the browser, and the server. The X-Robots-Tag directive is an optional element of such an HTTP response header. It was introduced by Google this year, and Yahoo announced two weeks ago that they support it, too.
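For illustration, here is roughly what such an exchange looks like on the wire (the hostname, file name, and header values are made up for the example); note the X-Robots-Tag line among the response headers:

```http
GET /report.pdf HTTP/1.1
Host: www.example.com
User-Agent: Mozilla/5.0 (browser details)

HTTP/1.1 200 OK
Date: Mon, 10 Dec 2007 10:00:00 GMT
Server: Apache
Content-Type: application/pdf
X-Robots-Tag: noindex
```

A search engine crawler that understands the directive will fetch the PDF but keep it out of its index.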
You might recall that there is an (X)HTML meta tag that allows you to restrict search engine access to a document, but it only works for (X)HTML documents. The X-Robots-Tag directive allows the same for any non-(X)HTML resource: video and audio files, images, PDF documents, etc.
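To send the directive, you configure your web server to add the header to matching responses. A minimal sketch, assuming an Apache server with mod_headers enabled (the file pattern and directives are illustrative):

```apache
# The (X)HTML-only way is a meta tag inside the document's <head>:
#   <meta name="robots" content="noindex, noarchive">
#
# The header-based way works for any file type. Here we ask
# search engines not to index or archive any PDF on the site:
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex, noarchive"
</FilesMatch>
```

The same idea extends to images, video, or any other binary format where no meta tag can be embedded.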
Welcome to the December 3, 2007 edition of Webmaster Articles Blog Carnival!
My name is John and I am your host. This edition is packed with a great variety of articles. You'll meet "old friends", as well as new fellows in the carnival scene. I am hosting this carnival for the first time, and I was pretty impressed by the high quality of the submitted posts and Blogs. No problem with Spam Blogs (Splogs) at all; it might be that the technology focus of this carnival doesn't attract as many spammers as the business-related carnivals do.
Thanks to everyone who contributed.
Each of the 41 articles below is handpicked. I suggest that you do not only look at the articles; look at the Blogs they come from, too. I found a lot of great recent posts there. Check them out. Maybe this carnival should run at least every month. My comments are open, so just say what you think.
Question of the day: Who will be the next host for this carnival?
Write a note to the organizer and get more exposure for your Blog. Volunteers preferred.
You can also submit your article to the next edition already. More info at the bottom of this post.
We have the following categories today: Design, Monetization, Traffic-Building, and General
Web masters, Web designers, and programmers do not have an easy job when it comes to updating templates, files, and programs for a Web site. In most cases they have to build on "code" that has been written by somebody else. Furthermore, there are many different languages, each with rules of its own:

HTML, XHTML, XML

Sometimes the differences are minor, but the interpreter, browser, or validation tool might complain if the syntax or content is not 100% correct. Use the common shared set of rules of these languages; it cannot be any easier. Print out that page and you have a valid reference for the most important languages on the Web.
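As a quick illustration of how minor those differences can be: XHTML must be well-formed XML, so a bare `<br>` that every browser happily accepts as HTML fails an XML parser. A small sketch in Python (the fragment strings are invented examples):

```python
import xml.etree.ElementTree as ET

html_style = "<p>line one<br>line two</p>"      # fine as HTML, not well-formed XML
xhtml_style = "<p>line one<br />line two</p>"   # well-formed XHTML fragment

def is_well_formed(fragment: str) -> bool:
    """Return True if the fragment parses as XML (the XHTML requirement)."""
    try:
        ET.fromstring(fragment)
        return True
    except ET.ParseError:
        return False

print(is_well_formed(html_style))   # False: the bare <br> is never closed
print(is_well_formed(xhtml_style))  # True: <br /> closes itself
```

A validation tool applies exactly this kind of strictness, which is why a page that "looks fine" in the browser can still fail validation.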
Have you ever read about META Tags, TITLE Tags, DESCRIPTIONS, etc. in Search Engine Optimization (SEO) guidelines?
Most of them are very vague and don't give you any examples. Even worse are most of the free, and even paid, so-called Meta Tag Optimizers. Stay away from them! They will cause more damage than good. They usually don't provide more functionality than your text editor. Really: some services will simply email you back the document that you completed yourself in their Web form. What a waste of time. Here is the deal: