Friday, April 30, 2010

Upgrade Land for Microsoft - Sharepoint / Exchange 2010, and JIRA 4.1

    I have been using Office 2010 for a while, having moved from the preview to the beta, and now I am finally on the release. We decided corporately to stick with 32-bit even though we are on 64-bit Windows 7 on our newer systems. The main reason for staying on 32-bit was that nearly all of the add-ons on the market are written for 32-bit only. When I was testing on 64-bit, I wished I had just stuck with 32. The released version has been stable for the last few days, but I didn't have many issues with the beta release either.

    Now that we have Office underway, we are beginning upgrades to the other 2010 products we use from Microsoft. The first is SharePoint; we are on MOSS 2007 right now. The migration was slightly painful, and here are some of the pointers that I found helpful along the way.

    The next step is a round of testing, and hopefully cutting over to the new version next weekend (5/8/10). We avoided any custom components on our SharePoint, which made the migration much simpler. So far we have had no complaints with the migrated test data. The new interface is awesome, and it works great in Chrome as well. Great job to the Microsoft team on this product!

    We are in the process of an Exchange 2010 upgrade as well; we are building out some new VMs and will migrate the mailboxes over. That project is still in its early stages, so I will post more on it as we go. My colleague is the lead on that project.

    On another side note, I moved us from JIRA 4.0 to JIRA 4.1. The upgrade was somewhat manual and required some work and planning. The new JIRA interface is very nice, and it's good to see them finally changing the reliable old interface they have had for many years. Now if they would only fix the UI for the admin section so I could stop scrolling through one huge list, that would be great!


 

Tridion Upgrade 2009 SP1

We have decided, after some pain, to give Tridion another go over here. We have some really sharp guys from the firm helping us, and they have helped us immensely. We just upgraded to the newest version, and after an initial struggle to get it running, it has gone very smoothly and simply. Within a couple of hours we moved everything over to the new version, and it's working flawlessly. It was very simple, and it was good to see the quality of the installers; they handled pretty much everything without any additional manual steps. We are looking forward to moving to the next version later this summer as we beta test it for them.

    A lot of the issues with the product were due to the implementation that was designed for us. We will be redoing our site and building it properly using the new version. I think that with proper guidance and a good technical team, we will not have the issues of the past. We are also moving a lot of our custom code from the current codebase into a web services layer that will isolate our code from the main Tridion content. I am looking forward to the project.

    There are lots of other things going on today: a new database server swap for our performance testing, and a bunch of other project work. It's good that it's quiet in the office as far as non-project work goes.

Monday, April 19, 2010

Week in Geneva

Just wrapping up a week of pretty intense work here in our datacenter. Here is a list of some of the fun projects we accomplished:

  1. Disk upgrades to the NetApp
    1. NetApp locally here in Switzerland went out of their way to fix issues caused by my purchase in the US. That's the last time I buy in the US and ship overseas.
    2. NetApp also looked over the system and made some very good corrections and suggestions; the great customer support was much appreciated.
  2. Reconfigured the network
    1. Moved 10GbE to other subnets
    2. Changed the NetApp network config
    3. Ran several additional cables and built out more infrastructure
  3. Firewall upgrades
  4. F5 upgrades from OS v9.4.3 to OS v10.1
  5. Installed 3 new VM servers
  6. Installed memory in systems (DB, VM)
  7. Office cleanup, and other infrastructure build-out
  8. Major failover testing of the NetApp, firewalls, and load balancers

Now we are trying to get home despite the volcanic ash situation in Europe. It looks like we will be driving our rental car to Barcelona and taking a flight from there. It should be an interesting little side trip.

More fun later. Glad to have a little break after working crazy hours this past week. :)

Tuesday, April 13, 2010

Finally a way to block those pesky bots stealing content

We've been using a product over at MFG which is sort of like an invisible CAPTCHA tool. The beauty of the product is that the end user doesn't even know it's running, but the accuracy and the technology behind it are very unique and cutting edge. We first started speaking with Pramana (www.pramana.com) over a year ago; initially there were issues with the technology, but it has progressed quickly and become rock solid. I was unable to get false positives in all my testing and scripting.

We implemented the technology (Pramana HumanPresent - www.pramana.com/human-present/) because of issues with competitors who sell databases and information about manufacturing companies, essentially stealing our content. They use various methods, including screen scraping and SEO scraping bots. We have observed this on many occasions, and we even had one company that wanted to sell out to us while they were stealing our data! (somewhat legally)

The product is not super simple to implement, but the benefits are great. They have SDKs for a bunch of languages (we use Java, which is more complex than the PHP API or the others they have). The SDKs give you all kinds of granular control.
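To give a feel for that kind of granular control, here is a rough sketch of how such a gate tends to shape up in a Java app. Every name here is a hypothetical illustration, not the actual Pramana API: stricter thresholds on the pages scrapers target, plus a monitor-only mode for when you just want reports rather than blocking.

```java
import java.util.Map;

public class BotGate {
    public enum Action { ALLOW, MONITOR, BLOCK }

    // Hypothetical per-path tuning: be aggressive on scraper targets,
    // lenient everywhere else. (Paths and thresholds are made up.)
    private static final Map<String, Double> PATH_THRESHOLDS = Map.of(
        "/suppliers", 0.5,
        "/search",    0.7
    );
    private static final double DEFAULT_THRESHOLD = 0.9;

    // false = monitor-only mode: detect and report, but never block.
    private final boolean blockingEnabled;

    public BotGate(boolean blockingEnabled) {
        this.blockingEnabled = blockingEnabled;
    }

    // botScore in [0,1] would come from the vendor's behavioral check;
    // here it is just an input so the decision logic is testable.
    public Action decide(String path, double botScore) {
        double threshold = PATH_THRESHOLDS.getOrDefault(path, DEFAULT_THRESHOLD);
        if (botScore < threshold) {
            return Action.ALLOW;
        }
        return blockingEnabled ? Action.BLOCK : Action.MONITOR;
    }
}
```

The monitor-only flag mirrors the detect-and-report versus pay-to-block split described below; the per-path map is the sort of knob the real SDKs expose in their own terms.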

We are a paying customer of Pramana, and they had the great idea of letting users use the service for free (called BotAlert - http://www.pramana.com/botalert/) in order to detect and measure the bots (you get nice daily reports from them); if you want to block the bots, then you have to pay. The cost is very reasonable considering it doesn't inconvenience users, and it can allow search engine crawlers to index content while homebuilt screen scrapers get blocked.
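One well-known way to let real search engine crawlers through while blocking imposters (independent of any vendor product) is DNS verification: reverse-resolve the client IP, check the hostname belongs to the crawler's domain, then forward-resolve that hostname and confirm it maps back to the same IP. A minimal sketch of the check in Java, with the DNS lookups done separately so the matching logic stays testable; the sample hostnames and IPs in the comments are illustrative:

```java
import java.util.Set;

public class CrawlerVerifier {
    // Domains Google publishes for its crawlers' reverse-DNS names.
    private static final Set<String> GOOGLE_DOMAINS =
        Set.of("googlebot.com", "google.com");

    /**
     * @param reverseHostname hostname from reverse DNS of the client IP
     *                        (e.g. "crawl-66-249-66-1.googlebot.com")
     * @param forwardIp       IP obtained by forward-resolving that hostname
     * @param clientIp        the original client IP from the request
     */
    public static boolean isVerifiedGooglebot(String reverseHostname,
                                              String forwardIp,
                                              String clientIp) {
        if (reverseHostname == null || forwardIp == null || clientIp == null) {
            return false;
        }
        // Suffix match must include the dot, so "googlebot.com.evil.com"
        // does not pass.
        boolean domainOk = GOOGLE_DOMAINS.stream().anyMatch(d ->
            reverseHostname.equals(d) || reverseHostname.endsWith("." + d));
        // Forward-confirmed reverse DNS: the name must resolve back
        // to the IP that made the request.
        return domainOk && forwardIp.equals(clientIp);
    }
}
```

In a live deployment the two lookups would come from `java.net.InetAddress` (reverse via `getCanonicalHostName`, forward via `getAllByName`), with the results cached, since doing DNS on every request would be far too slow.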