
Issues with Tridion

Updated on 7/24:

We use the Tridion CMS (www.tridion.com), a high-end CMS product, and we have had a lot of trouble with it in the past. The Content Manager is a strange beast built from a combination of VB, .NET, and other technologies. It is always breaking and is not reliable, and we need to debug broken behavior on it on a regular basis. We keep the systems under change control, but it still manages to break easily. The good news is that it spits out generated JSP pages, which run very reliably on Resin application servers.

The support is always excellent and responsive, which helps us deal with the code issues we run into during development. In the last case they actually went out of their way to take our content database and replicate the issue, pointing to our code as the cause. That is something few vendors would do for a small customer like us. It still doesn't make up for the strange design of the system, and I am not sure whether that is a problem with our implementation or with the product itself.

Here is a snippet from my emails with support:


It is not supported to run other versions of .Net on the Content Management server - only the version listed in section 2.2.d of the "SDL Tridion R5 Product Prerequisites 5.3.pdf" (Microsoft .NET Framework version 2.0) document are supported. Please uninstall all versions of .Net Framework, reboot, and install the supported version.


Last time I checked, Microsoft designed the .NET Framework versions to install side by side, and even keeps each framework in its own folder:

For example, on my machine:

Directory of C:\Windows\Microsoft.NET\Framework

04/22/2009 02:17 AM v1.0.3705
04/22/2009 02:17 AM v1.1.4322
06/29/2009 03:14 PM v2.0.50727
04/22/2009 05:01 AM v3.0
05/18/2009 02:58 PM v3.5
06/29/2009 11:08 AM VJSharp
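
Those side-by-side folders exist precisely so that each application can choose which runtime it loads. As a sketch of how this works (the filename here is illustrative), a .NET executable can pin itself to the 2.0 runtime via its app.config, regardless of which other frameworks are installed:

```xml
<!-- MyApp.exe.config: pins this process to the .NET 2.0 runtime -->
<configuration>
  <startup>
    <!-- Only the listed runtime is used to host this executable;
         other installed framework versions are ignored at load time -->
    <supportedRuntime version="v2.0.50727"/>
  </startup>
</configuration>
```

With a configuration like this, the other versions sitting in C:\Windows\Microsoft.NET\Framework simply never get loaded into the process, which is why "uninstall all other versions of .NET" should rarely be necessary.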

Since my last post was a bit too harsh, and I didn't give credit where I meant to, I now have to meet with the Tridion folks on Monday... what did I get myself and my poor colleagues into? Oops...
