A good evaluation of some of the more popular open source CMS software packages available.
Google has just launched another beta search engine that internet marketers should not ignore. With this new feature, Google is in effect stating that blogs are important enough to warrant their own search engine. Most blogging software will ping a blog service when there is a new post. Google takes advantage of this and will list a blog and spider it for new content based on these pings. The result is that if a blog is updated regularly, it will be spidered and indexed regularly. This new search engine makes it ever more important to blog as a way to promote your website.
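The ping itself is typically a small XML-RPC call named "weblogUpdates.ping" carrying the blog's name and URL. Here is a minimal Python sketch that builds such a payload using only the standard library; the blog name and URLs are placeholders, and the exact endpoint your platform pings may differ:

```python
import xmlrpc.client

# Build the XML-RPC request body for a "weblogUpdates.ping" call without
# sending it over the network. The blog name and URL are placeholders.
payload = xmlrpc.client.dumps(
    ("My Example Blog", "http://www.example.com/blog/"),
    methodname="weblogUpdates.ping",
)
print(payload)

# Actually sending the ping (requires network access) would look roughly like:
# server = xmlrpc.client.ServerProxy("http://blogsearch.google.com/ping/RPC2")
# server.weblogUpdates.ping("My Example Blog", "http://www.example.com/blog/")
```

Most blogging packages fire a call like this automatically on every new post, which is what lets the blog search engine re-spider active blogs so quickly.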
Google implemented a beta version of its Sitemaps service in June 2005. This allows a website owner to create a sitemap using XML and submit it to Google. When the Google spider crawls the site, it will use the information provided by the sitemap. With this sitemap, you can specify each page's priority, last-modified date, and frequency of modification. This helps Google figure out how often to re-crawl a page for changes and how important it is. This does not replace their old method of crawling and indexing pages, but rather supplements it.
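As a rough illustration, the sitemap is just an XML file listing each URL with those three optional hints. The sketch below generates one with Python's standard library; the namespace shown is the one Google used for its original Sitemaps beta, and the page URL and values are placeholders:

```python
import xml.etree.ElementTree as ET

# Namespace from Google's original Sitemaps beta (the format was later
# standardized at sitemaps.org under a different namespace).
NS = "http://www.google.com/schemas/sitemap/0.84"

def build_sitemap(pages):
    """Build a sitemap XML string from a list of page dicts."""
    urlset = ET.Element("urlset", xmlns=NS)
    for page in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page["url"]
        ET.SubElement(url, "lastmod").text = page["lastmod"]        # last-modified date
        ET.SubElement(url, "changefreq").text = page["changefreq"]  # how often it changes
        ET.SubElement(url, "priority").text = page["priority"]      # relative importance
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([{
    "url": "http://www.example.com/",
    "lastmod": "2005-06-01",
    "changefreq": "weekly",
    "priority": "0.8",
}])
print(sitemap)
```

You would save the output as something like sitemap.xml at your site root and submit its URL to Google.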
At this time the feature is new, and the advantages and disadvantages of implementing a sitemap have yet to be determined. However, since Google is offering a new feature, it is in their best interest to make it worthwhile for website owners so that the initiative takes off.
I just came across a website that is designed beautifully but will have a lot of trouble getting listed on search engines. First, the site uses Flash for all of its navigation. At this time, search engines cannot read content inside Flash files, so most of the pages on the site are hidden. Second, all the content pages are displayed in a frame. The problem is that if a search engine sends a visitor directly to a content page, there is no navigation to reach another page on the site, because the navigation is not part of the content page. This makes the page a dead end: the user has to hit the back button or type in another URL to leave. Not a very good eyeball-retention technique.
How unfortunate that some very aesthetically pleasing sites will rarely be viewed.
A lot of websites are built before any thought is put into marketing. The fact is, much of what is needed to properly market a website is deeply tied to how the site is designed and built: the way links are used, the organization of information, the type of technology chosen, and so on. It would be much more effective to build a website with this in mind instead of treating it as an afterthought.
Many webmasters have noticed strange activity in their server logs recently. It seems that bots or viruses are probing their email scripts for vulnerabilities, and in many cases those vulnerabilities exist. The bots hack the sites by entering crafted data into the email script and sending emails to whomever they wish, in effect turning your website into a spam relay.
The problem this poses is that your website might be blacklisted by services like AOL, Yahoo, etc. When their users get spam, they can easily click a button to tag it as spam and have it reviewed by the email service provider. If enough emails from the IP address where your website is hosted are tagged as spam, the email service provider will notify your ISP that the IP address will be blacklisted if no action is taken. Your ISP will usually suspend your account at that point, and your website will go down.
Make sure the email scripts on your website are secure. Build some checks into the script before it sends an email: verify that the message is legitimate based on data you have in your system or your business logic. For instance, if you know your website only sends emails to people who are in your database, then check for that.
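The idea above can be sketched in a few lines of Python. This is a minimal illustration, not a specific framework's API: the recipient list and the header check are assumptions about your own setup. The newline check matters because the typical attack injects extra To:/Bcc: headers through a user-supplied field:

```python
# Recipients your system actually knows about; in practice this would come
# from your database (placeholder addresses for illustration).
KNOWN_RECIPIENTS = {"sales@example.com", "support@example.com"}

def is_safe_email(to_address, subject):
    """Return True only if this send request passes basic sanity checks."""
    # Only send to addresses your system already knows about.
    if to_address not in KNOWN_RECIPIENTS:
        return False
    # Reject header-injection attempts: a newline in a user-supplied field
    # lets an attacker append extra To:/Bcc: headers to the message.
    if "\n" in subject or "\r" in subject:
        return False
    return True

print(is_safe_email("sales@example.com", "Question about pricing"))   # legitimate request
print(is_safe_email("victim@spam-target.net", "Buy now"))             # unknown recipient
print(is_safe_email("sales@example.com", "Hi\nBcc: victim@x.com"))    # injection attempt
```

Your script would call a check like this before handing anything to the mailer, and silently drop (or log) anything that fails.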