Ajith Prabhakar’s Weblog

A beginner’s guide to Documentum

Introducing Java Code Generator 1.0: A Utility to Generate Java Beans from Documentum Object Types

Posted by Ajith Prabhakar on September 7, 2009


Java Code Generator generates Java classes from Documentum object types. A few bullet points about what this utility does:

  • Generates Java classes from Documentum object types
  • All non-inherited attributes become member variables of the generated Java class
  • Repeating attributes are represented as arrays of the corresponding Java type
  • The class name defaults to the capitalized name of the underlying Documentum object type
  • Option to prefix and suffix the class name
  • Option to specify the package name
  • Supports DFS annotations

After a couple of beta versions, I am glad to finally announce the Java Code Generator. Thanks a lot to everyone who tried the betas and sent me valuable feedback. I have tried to incorporate most of the suggestions and fixed many of the bugs in this version.

I have added a new DFC version of this tool to the download page.

Click here to go to the Downloads page.

Posted in Content Server, DFC, DFS, Documentum, General

Service Based Objects (SBOs) in Documentum

Posted by Ajith Prabhakar on July 20, 2009


The Documentum Business Object Framework (BOF), introduced in Documentum 5.3, plays a key role in most current Documentum implementations. The Service Based Object (SBO) is one of the important members of the BOF family. Let's see what makes Service Based Objects so popular and how you can implement one.

What is an SBO

In simple terms, an SBO in Documentum can be compared to a session bean in a J2EE environment. SBOs let developers concentrate on just the business logic, while all other aspects are managed by the framework. This reduces application code significantly and removes a lot of complexity. The biggest advantage of a BOF module is that it is deployed in a central repository; the repository maintains the module, and DFC ensures that the latest version of the code is delivered to the client automatically.

Service Based Objects are repository and object type independent: the same SBO can be used by multiple Documentum repositories, and it can retrieve and operate on different object types. SBOs can also access external resources, for example a mail server or an LDAP server. Prior to the introduction of Documentum Foundation Services, SBOs were commonly used to expose Documentum web services.

An SBO can call another SBO, and an SBO can be called by any Type Based Object. (Type Based Objects (TBOs) are a different kind of business object, which I will explain in a separate study note.)

A very easy-to-understand example of an SBO implementation is a zip code validator. Multiple object types across multiple repositories might carry a zip code. If this functionality is exposed as an SBO, it can be used by custom applications irrespective of object types and repositories. The validator SBO can even be used by different TBOs for validation.

Here are some bullet points about SBOs for easy recall:

  • SBOs are part of the Documentum Business Object Framework
  • SBOs are not associated with any repository
  • SBOs are not associated with any Documentum object type
  • SBO information is stored in the repository designated as the global registry
  • SBOs are stored in the /System/Modules/SBO/<sbo_name> folder of the global registry, where <sbo_name> is the name of the SBO
  • Each folder in /System/Modules/SBO/ corresponds to an individual SBO (the query sketch below lists them)
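If you want to see which SBOs are registered, a quick DFC query against the global registry does the trick. This is only a minimal sketch, assuming you already have an open IDfSession (passed in as session) to the global registry repository:

import com.documentum.fc.client.DfQuery;
import com.documentum.fc.client.IDfCollection;
import com.documentum.fc.client.IDfQuery;
import com.documentum.fc.client.IDfSession;
import com.documentum.fc.common.DfException;

// Lists the folders under /System/Modules/SBO -- one folder per registered SBO.
public static void listSbos(IDfSession session) throws DfException {
    IDfQuery query = new DfQuery();
    query.setDQL("select object_name from dm_folder where folder('/System/Modules/SBO')");
    IDfCollection results = query.execute(session, IDfQuery.DF_READ_QUERY);
    try {
        while (results.next()) {
            System.out.println(results.getString("object_name"));
        }
    } finally {
        results.close(); // always close collections to free server resources
    }
}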

How to implement an SBO using Composer

The steps to create an SBO are:

1) Create an interface that extends IDfService and define your business method
2) Create the implementation class and write your business logic. This class should extend DfService and implement the interface defined in step 1
3) Create a jar file for the interface and another jar for the implementation class, then create Jar Definitions
4) Create an SBO module and deploy your Documentum Archive using Documentum Composer (Application Builder for older versions)

Let's walk through these steps with the example zip code validator SBO. I am not covering the Application Builder steps here; the screenshots and notes will give you an insight into how to use Documentum Composer to implement a Service Based Object in Documentum version 6 or above.

Step 1 : Create an interface and define your Business method

The first step is to create an interface that defines the business functionality. This interface should extend the IDfService interface. Client applications will use this interface to instantiate the SBO.

Click New –> Interface in Documentum Composer. Click the Add button next to Extended Interfaces, search for IDfService, select it, and click OK.


Now add the business method validateZipCode() to the interface. The code should look like the following:

package com.ajithp.studynotes.sbo;

import com.documentum.fc.client.IDfService;
import com.documentum.fc.client.IDfSysObject;
import com.documentum.fc.common.DfException;

public interface IZipValidatorSBO extends IDfService {

    public void validateZipCode(IDfSysObject obj, String zipCode, String repository) throws DfException;

}
Step 2 : Create the implementation class

All Service Based Object implementation classes should extend the DfService class and implement the interface created in the first step. DfService is an abstract class. A few of its methods, which were abstract in 5.3, have default implementations in 6.0 and later:

Method Name | Returns | More information
getVendorString() | String | The default implementation returns an empty String; override it to change that.
getVersion() | String | The default implementation does not return a meaningful version; override it to return your own major.minor version.
isCompatible() | boolean | The default implementation returns true only if the version is an exact match.

Let's look at some other important methods of the DfService class before we move further:

Method Name | Returns | More information
getName() | String | Returns the fully qualified logical name of the service interface.
getSession() | IDfSession | Returns an IDfSession object for the docbase name passed as its argument. Make sure you call releaseSession() once you are done with the operation that uses the session.
releaseSession() | void | Releases the handle to the session reference passed to this method.
getSessionManager() | IDfSessionManager | Returns the session manager.

Managing repository sessions in an SBO

As we saw in the previous table, it is good practice to release the repository session as soon as you are done with it, so the ideal pattern looks like this:

// Get the session
IDfSession session = getSession(repoName);
try {
    // do the operation with the session
} catch (Exception e) {
    // process the exception
} finally {
    // release the session
    releaseSession(session);
}

Transactions in SBO

Another important thing to know is how to handle transactions in an SBO. Note that only session manager transactions can be used in an SBO; the system throws an exception when a session-based transaction is used within an SBO.

beginTransaction() starts a new transaction; use commitTransaction() to commit it or abortTransaction() to abort it. Always ensure that you are not beginning a transaction while another transaction is active; you can use isTransactionActive() to find out whether one is.

Another important point: if your SBO did not start the transaction, do not commit or abort it in the SBO code; if you need the transaction aborted, call the setTransactionRollbackOnly() method instead. A sketch of this pattern follows.
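Here is a minimal sketch of the transaction pattern, assuming it runs inside a method of a class extending DfService (so getSessionManager() is inherited) and with the actual repository operations elided:

IDfSessionManager sMgr = getSessionManager();
// Begin a transaction only if no other transaction is already active.
boolean startedHere = !sMgr.isTransactionActive();
if (startedHere) {
    sMgr.beginTransaction();
}
try {
    // ... repository operations go here ...
    if (startedHere) {
        sMgr.commitTransaction();
    }
} catch (Exception e) {
    if (startedHere) {
        sMgr.abortTransaction();           // we started the transaction, so we abort it
    } else {
        sMgr.setTransactionRollbackOnly(); // someone else owns it, so just flag it
    }
    throw new DfException();
}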

Other important points

1) Since SBOs are repository independent, do not hard-code repository names in the methods. Either pass the repository name as a method parameter, or keep it as a variable in the SBO and populate it with a setter method after instantiation.

2) Always try to make SBOs stateless (it is a pain to manage stateful SBOs).

3) Don't reuse SBO instances; always create a new instance for each operation.

Now let's see how to code our zip code validator SBO.

Click New –> Class, click the Browse button next to Superclass, search for and select DfService, then in Interfaces search for the interface created in the previous step and click OK. Also select the option Inherited abstract methods under "Which method stubs would you like to create?".


I have overridden the getVersion() method for illustration purposes. See the inline comments in the code sample.

package com.ajithp.studynotes.sbo.impl;

import com.ajithp.studynotes.sbo.IZipValidatorSBO;
import com.documentum.fc.client.DfService;
import com.documentum.fc.client.IDfSession;
import com.documentum.fc.client.IDfSysObject;
import com.documentum.fc.common.DfException;

public class ZipValidator extends DfService implements IZipValidatorSBO {

    public static final String versionString = "1.0";

    // overriding the default
    public String getVersion() {
        return versionString;
    }

    public void validateZipCode(IDfSysObject obj, String zipCode, String repository) throws DfException {
        IDfSession session = getSession(repository);
        try {
            if (isValidUSZipcode(zipCode)) {
                obj.setString("zipcode", zipCode);
                obj.save();
            }
        } catch (Exception e) {
            /* Assuming the transaction is handled outside this code; this tells DFC
               to abort the transaction in case of any error */
            getSessionManager().setTransactionRollbackOnly();
            throw new DfException();
        } finally {
            releaseSession(session);
        }
    }

    private boolean isValidUSZipcode(String zipCode) {
        // implement your logic to validate the zip code,
        // or even call an external web service to do it;
        // returning true for all zip codes here
        return true;
    }

}
Step 3 : Generate Jar files and Create Jar Definitions

The next step in SBO creation is to create the jar files that hold the interface and the implementation classes. These jars are required to deploy your SBO.

Use Composer's/Eclipse's Export JAR option or the command-line jar tool to create the jar files.


Selecting the sbo package to create the interface jar


Selecting the com.ajithp.studynotes.sbo.impl package for the implementation jar.

Look at Composer's Export JAR screenshots for the interface and implementation (refer to the Eclipse documentation for more details); the figures above are self-explanatory.

The command line to create a jar file is jar cf <name_of_jar> <input_files>. Please look at the Java documentation for more details on the switches and options of the jar command.

The creation of Jar Definitions is a new step added in Composer.

1) In Composer, change the perspective to Documentum Artifacts, then click New –> Other –> Documentum Artifacts –> Jar Definition.


2) Click Next, enter the name for the Jar Definition, and click Finish.

3) Select Type as Interface if the jar has only the interface, Implementation if the jar has only the implementation of the interface, or Interface and Implementation if a single jar file has both. Click the Browse button and browse to the jar created in the last step.

In our case, create two Jar Definitions: the first with type Interface pointing to the interface jar, and the second with type Implementation pointing to the implementation jar.


Name the interface Jar Definition zipcodevalidator and the implementation Jar Definition zipcodevalidatorimpl.

Step 4 : Create a Module and Deploy the SBO

In Composer, change the perspective to Documentum Artifacts, then click New –> Other –> Documentum Artifacts –> Module.


Give a valid name, leave the default folder, and click Finish.

In the Module edit window, select SBO from the dropdown.


Now click Add in the Implementation Jars section under Core Jars. A pop-up window will appear listing all the Jar Definitions of type Implementation or Interface and Implementation. Select the one for our SBO, that is, zipcodevalidatorimpl.


Click the Select button next to Class name and select the implementation class, in this case ZipValidator.


Now click Add in the Interface Jars section under Core Jars. A pop-up window will appear listing all the Jar Definitions of type Interface or Interface and Implementation. Select the one for our SBO, that is, zipcodevalidator.


For details of the other options, refer to the Documentum Composer manual. Save the module.

Now right-click on the project and install the Documentum project.


Click the Login button; after logging in, click Finish to start the installation.


 

Look at the Documentum Composer documentation to learn more about the installation options.

How to use an SBO from a Client Application

Follow these steps to instantiate an SBO from a client application:

1) Get the local client

2) Create a login info object and populate the login credentials

3) Create an IDfSessionManager object

4) Use newService() on the client object to create an SBO instance

// create client
IDfClient myClient = DfClient.getLocalClient();

// create login info
IDfLoginInfo myLoginInfo = new DfLoginInfo();
myLoginInfo.setUser("user");
myLoginInfo.setPassword("pwd");

// create session manager
IDfSessionManager mySessionManager = myClient.newSessionManager();
mySessionManager.setIdentity("repositoryName", myLoginInfo);

// instantiate the SBO
IZipValidatorSBO zipValidator = (IZipValidatorSBO) myClient.newService(IZipValidatorSBO.class.getName(), mySessionManager);

// call the SBO service
zipValidator.validateZipCode(obj, zipCode, "repositoryName");

Download this Study Note (PDF)

Posted in BOF, DFC, Documentum

Using Java reflection to reduce Code and Development time in DFS

Posted by Ajith Prabhakar on March 11, 2009


Java reflection is one of the most powerful APIs of the Java language, and it can be used to reduce code significantly.

Most current enterprise applications consist of different layers and use value objects to transfer data from one layer to another. Calling the getters and setters of every value object attribute by hand inflates application code and development time; effective use of reflection can reduce both significantly.

So let's take a scenario: I have an object type MyObjectType extending dm_document with 50 additional attributes. dm_document as of Documentum 6.5 has 86 attributes, so adding 50 more gives us 136 attributes for this object type. Consider a standard web application backed by DFS that needs to manipulate (add or edit) instances of this object type: the service needs to add all these attributes to the PropertySet of the DataObject representing the instance, then call the appropriate service.

Assuming the bean instance of MyObjectType is named myObjectBean, the standard code will be something like this:


ObjectIdentity objIdentity = new ObjectIdentity("myRepository");
DataObject dataObject = new DataObject(objIdentity, "dm_document");
PropertySet properties = dataObject.getProperties();
properties.set("object_name", myObjectBean.getObject_Name());
properties.set("title", myObjectBean.getTitle());
// ... one line per attribute ...
objectService.create(new DataPackage(dataObject), operationOptions);

 

In the above code you have to explicitly set each individual attribute of the object; the more attributes there are, the more complex and messy the code.

Take another example, where you have to retrieve an object's information and pass it over to the UI layer:


 myObjectBean.setObject_name(properties.get("object_name").getValueAsString());
 myObjectBean.setTitle(properties.get("title").getValueAsString());
 myObjectBean.setMy_Custom_Property(properties.get("my_custom_property").getValueAsString());

This operation becomes even more complex if you decide to match the data types of your bean with those of the object type.

 

So what is the best approach to reducing this complexity? The answer is effective use of the reflection API.

Let's take a step-by-step approach to the issue.

 

To understand this better, consider the following attributes of mycustomobjecttype:

 

Attribute Name | Attribute Type
first_name | String
last_name | String
age | integer
date_purchased | time
amount_due | double
local_buyer | boolean

 

Java Bean

Create a Java bean that matches the object type:


import java.util.Date;

public class Mycustomobjecttype {
  protected String first_name ;
  protected String last_name  ;
  protected int age;
  protected Date date_purchased  ;
  protected double amount_due  ;
  protected boolean local_buyer ;
  public int getAge() {
    return age;
  }
  public void setAge(int age) {
    this.age = age;
  }
  public double getAmount_due() {
    return amount_due;
  }
  public void setAmount_due(double amount_due) {
    this.amount_due = amount_due;
  }
  public Date getDate_purchased() {
    return date_purchased;
  }
  public void setDate_purchased(Date date_purchased) {
    this.date_purchased = date_purchased;
  }
  public String getFirst_name() {
    return first_name;
  }
  public void setFirst_name(String first_name) {
    this.first_name = first_name;
  }
  public String getLast_name() {
    return last_name;
  }
  public void setLast_name(String last_name) {
    this.last_name = last_name;
  }
  public boolean isLocal_buyer() {
    return local_buyer;
  }
  public void setLocal_buyer(boolean local_buyer) {
    this.local_buyer = local_buyer;
  }
}
 

Getting the Values from PropertySet (Loading Java Bean)

……


List<DataObject> dataObjectList = dataPackage.getDataObjects();
DataObject dObject = dataObjectList.get(0);
Mycustomobjecttype myCustomObject = new Mycustomobjecttype();
populateBeanFromPropertySet(dObject.getProperties(),myCustomObject);

……

// See the Reflection in Action here 

public void populateBeanFromPropertySet(PropertySet propertySet, Object bean)
    throws Exception {

  BeanInfo beaninformation = Introspector.getBeanInfo(bean.getClass());
  PropertyDescriptor[] sourceDescriptors = beaninformation.getPropertyDescriptors();
  for (PropertyDescriptor descriptor : sourceDescriptors) {

    Object result = null;
    String name = descriptor.getName();
    if (!name.equals("class")) {
      if (propertySet.get(name) != null) {
        String typeName = descriptor.getPropertyType().getName();
        if (typeName.equals("int")) {
          result = new Integer(propertySet.get(name).getValueAsString());
        } else if (typeName.equals("double")) {
          result = new Double(propertySet.get(name).getValueAsString());
        } else if (typeName.equals("boolean")) {
          result = new Boolean(propertySet.get(name).getValueAsString());
        } else if (typeName.equals("java.util.Date")) {
          DateProperty dat = (DateProperty) propertySet.get(name);
          result = dat.getValue();
        } else {
          // none of the other expected types, so treat it as a String
          result = propertySet.get(name).getValueAsString();
        }
        if (result != null)
          descriptor.getWriteMethod().invoke(bean, result);
      }
    }
  }
}
  

Setting Values to Property Set

 


public DataPackage createContentLessObject(Mycustomobjecttype myCustomType) throws Exception {
  ObjectIdentity objectIdentity = new ObjectIdentity("testRepositoryName");
  DataObject dataObject = new DataObject(objectIdentity, myCustomType.getClass().getName());

  PropertySet properties = populateProperties(myCustomType);
  properties.set("object_name", myCustomType.getFirst_name() + myCustomType.getLast_name());
  dataObject.setProperties(properties);

  DataPackage dataPackage = new DataPackage(dataObject);
  OperationOptions operationOptions = new OperationOptions();

  return objectService.create(dataPackage, operationOptions);
}

 


// Reflection in Action
public PropertySet populateProperties(Object bean) throws Exception {
  PropertySet myPropertyset = new PropertySet();
  BeanInfo beaninfo = Introspector.getBeanInfo(bean.getClass());
  PropertyDescriptor[] sourceDescriptors = beaninfo.getPropertyDescriptors();
  for (PropertyDescriptor descriptor : sourceDescriptors) {
    String propertyName = descriptor.getName();
    if (!propertyName.equals("class")) {
      // don't set read-only attributes, if any (for example r_object_id)
      if (!propertyName.startsWith("r")) {
        Object value = descriptor.getReadMethod().invoke(bean);
        if (value != null) {
          myPropertyset.set(propertyName, value);
        }
      }
    }
  }
  return myPropertyset;
}

Posted in DFS, Documentum

Chaining of Custom Services in DFS

Posted by Ajith Prabhakar on January 25, 2009


 

There is an interesting drawback in Documentum Foundation Services version 6.5.

Issue:

When you chain custom services and try to build them, the build fails. Let's look at a scenario from the DFS sample code itself:

@DfsPojoService(targetNamespace = "http://common.samples.services.emc.com", requiresAuthentication = true)
public class HelloWorldService
{
    public String sayHello(String name)
    {
        ServiceFactory serviceFactory = ServiceFactory.getInstance();
        IServiceContext context = ContextFactory.getInstance().getContext();
        try {
            IAcmeCustomService secondService = serviceFactory.getService(IAcmeCustomService.class, context);
            secondService.testExceptionHandling();
        } catch (ServiceInvocationException e) {
            e.printStackTrace();
        } catch (CustomException e) {
            e.printStackTrace();
        } catch (ServiceException e) {
            e.printStackTrace();
        }
        return "Hello " + name;
    }
}

Here, in the DFS sample code, I am chaining the services. Everything looks fine, but when you build this service, the generateArtifacts Ant task fails with a ClassNotFound compiler error at

IAcmeCustomService secondService = serviceFactory.getService(IAcmeCustomService.class, context);

What happens is that when the build does its initial cleanup, all the generated client interfaces are deleted, and DFS currently does not check for any dependencies.

Let me take the example of dfs-build.xml, which is part of the core Documentum project in Composer:

<generateArtifacts serviceModel="${gen.src.dir}/${context.root}-${module.name}-service-model.xml" destdir="${gen.src.dir}/">
    <src location="${src.dir}" />
    <classpath>
        <path refid="projectclasspath.path" />
    </classpath>
</generateArtifacts>
</target>

 

Here we cannot set any exclusion path in <src location="${src.dir}" />, simply because even if you provide a <fileset/> or <dirset/> with a pattern set, it is not recognized.

I raised a support case with EMC and they told me that this is not currently supported, and that they will add it as a feature request.

This means we cannot chain custom services unless EMC fixes this or we use a semi-manual workaround.

The workaround that I found

Follow these steps to overcome the issue:

Step 1

Identify the services that call other custom services and create a new source directory for them in Composer. Here I call it depended_src; move the services that call custom services there. The depended_src directory must be on a separate path from the web services src.


Step 2

1) Edit the build file and add these two properties:

 

<property name="my.core.services.classes" value="${service.projectdir}/Web Services/bin/classes" />

<property name="dep.src.dir" value="${service.projectdir}/depended_src" />

dep.src.dir should point to the depended_src location mentioned in Step 1.

2) Create an additional target for generateModel and generateArtifacts:

<target name="generateDependencies" depends="generate">
    <echo message="Calling generateDependencies" />
    <generateModel contextRoot="${context.root}" moduleName="${module.name}" destdir="${gen.src.dir}/">
        <services>
            <fileset dir="${dep.src.dir}">
                <include name="**/*.java" />
            </fileset>
        </services>
        <classpath>
            <pathelement location="${my.core.services.classes}" />
            <path refid="projectclasspath.path" />
        </classpath>
    </generateModel>
    <generateArtifacts serviceModel="${gen.src.dir}/${context.root}-${module.name}-service-model.xml" destdir="${gen.src.dir}/">
        <src location="${dep.src.dir}" />
        <classpath>
            <pathelement location="${my.core.services.classes}" />
            <path refid="projectclasspath.path" />
        </classpath>
    </generateArtifacts>
    <!-- signal build is done -->
    <!-- used by DFSBuilder.java -->
    <copy todir="${src.dir}/../" file="${basedir}/dfs-builddone.flag" />
</target>

3) Now edit dfs-build.properties and add the following property:

service.projectdir=<absolute path to the project>

Step 3

1) Run the generate task,

2) Copy all the service entries (between <module> and </module>) from <context-root>-<module-name>-service-model.xml; you can find this file in the <project_dir>\Web Services\bin\gen-src folder

3) Now run the generateDependencies task created in Step 2

4) Edit the regenerated <context-root>-<module-name>-service-model.xml and add the copied service entries back to it

5) If you want to create the jar files, you can call the package task after this

This should help you chain custom services; if you find any alternative ways, please comment.

 

Posted in DFS, Documentum

Data Dictionary in Documentum

Posted by Ajith Prabhakar on October 14, 2008


As you all know, Documentum is all about objects and their attributes. Have you ever wondered where Documentum stores information about its object types and their attributes?

To answer this question you need to know about the Data Dictionary. Let me walk you through some of its important aspects in this study note.

What is the use of the Data Dictionary

The following is a partial list of the information about an object type that the Data Dictionary stores: attribute labels, help text and other (localized) information, default attribute values, value assistance, value mapping, constraints, and the default lifecycle of the object type. A Documentum client application can leverage this information to build the presentation layer for that object type and to enforce some business rules.

 

Another interesting point about the Data Dictionary is that it supports multiple locales, which means you can configure multiple locales for each object type. Each locale represents a geographical region.

Imagine an organization with offices in France, Spain and the US. The Data Dictionary allows you to store each attribute label in all three languages, and the client application (e.g. Webtop or a custom UI) can fetch the labels in the language of the region and display them to the user.

More about Data Dictionary

Let's look at some useful features of the Data Dictionary in detail, for a better understanding of the information listed above. First we will cover the UI-related features, then the business rules and functionality you can enforce through the Data Dictionary.

 

UI Related

1) Default Values for Attributes

When creating or modifying an object type you can specify a default value for an attribute; if the user specifies no value for that attribute, the default is set as the attribute value.

2) Value Assistance

Value assistance provides the user with a drop-down list of possible values for an attribute. It can even be conditional, meaning the values offered change based on selected criteria (conditional value assistance).

Another important point: the values used for value assistance can be a fixed list or come from a DQL query that runs dynamically.

3) Value Mapping

Value mapping is another useful feature, where one value is mapped to another as a key-value pair. For example, consider this list used for value mapping: New Jersey – NJ, New York – NY, and New Hampshire – NH.

This makes it possible to display the complete state name in the UI while storing just the state code.

4) Internationalization of Various Texts

If you look at Webtop or any Documentum UI application (not necessarily a custom-built user interface), you can see a lot of text such as labels, error messages and help information. These text bits can be stored for different locales in the Data Dictionary, and different locales mean different languages. This helps you build a single UI for a global application and support multiple languages.

 

Business Rule and Functionality

1) Lifecycles for an object type.

During the creation or modification of an object type you can specify a lifecycle as its default lifecycle. This eliminates the pain of searching for a lifecycle name or ID when attaching a newly created object: the user can simply use the keyword default when attaching a new object instance to a lifecycle.
But note that merely specifying a default lifecycle does not attach objects to it; the creator or the application still has to explicitly attach each object to the lifecycle.

2) Constraints

You can validate a property by adding constraints to it. The possible types of constraints are listed below. An important point to note is that the Content Server does not enforce these constraints even though you define them in the Data Dictionary; typically the client application reads the constraints and enforces them. You can also store localized error messages for validation errors in the Data Dictionary.

a) Primary key
A primary key should be added in combination with a not-null constraint. Primary keys are inherited. One or more attributes can make up a primary key, but only single-valued properties can be part of it. An object type can have only one primary key definition (though it can have more if it inherits a primary key from its supertype). Primary key constraints can be defined at either the object type level or the property level: if the key has more than one participating property, it must be defined at the type level; if the key is a single property, it is a good idea to define it at the property level.

b) Unique key
A unique key constraint enforces that all objects of the type have a unique value for a property or combination of properties. The key can be one or more single-valued properties or one or more repeating properties defined in the object type itself (not inherited properties); it cannot mix single-valued and repeating properties. Unique key constraints themselves are inherited.

c) Foreign key
A foreign key constraint identifies a relationship between one or more properties of one object type and one or more properties of another. The number and data types of the properties in each set must match. Foreign key constraints can be defined at the object type level or at the property level; if the key has two or more participating properties, it must be defined at the type level. Both object types must reside in the same repository, and the corresponding parent and child properties must be of the same data type.

d) Not null
A NOT NULL constraint on a property disallows null values. It can be defined only at the property level and only for single-valued properties.

e) Check
Check constraints are used for validating data. An expression or script can be provided in the constraint's definition that the client application runs to validate a given property's value. Check constraints can be defined at the object type level or the attribute level.

 

How is the Data Dictionary modified

Modifying the Data Dictionary means either adding information for a new object type or changing information for an existing one.

Adding a new object type with any of the above details can be done either by running a CREATE TYPE DQL script or by creating the type in a new or existing DAR and deploying it.

Modifying existing object type information can be done by editing the type in the DocApp or DAR, or by running an ALTER TYPE DQL script (a sketch follows after the note below).

Please note that the DAR (Documentum Archive) applies only to repositories running Documentum 6 or higher.
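As an illustration, here is a sketch of issuing such a DQL statement through DFC. The type and attribute names are hypothetical, and the default-value syntax should be verified against the Content Server DQL reference; the PUBLISH keyword asks the server to publish the data dictionary information immediately.

import com.documentum.fc.client.DfQuery;
import com.documentum.fc.client.IDfCollection;
import com.documentum.fc.client.IDfQuery;
import com.documentum.fc.client.IDfSession;
import com.documentum.fc.common.DfException;

// Creates a hypothetical custom type with a default value and publishes it.
public static void createTypeWithDefault(IDfSession session) throws DfException {
    IDfQuery query = new DfQuery();
    query.setDQL("CREATE TYPE \"my_custom_type\" (\"status\" string(16) (DEFAULT = 'draft'))"
            + " WITH SUPERTYPE \"dm_document\" PUBLISH");
    IDfCollection result = query.execute(session, IDfQuery.DF_QUERY);
    result.close();
}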

 

Data Dictionary Publishing Job

When you update the Data Dictionary, in essence it updates internal object types, and you need to run the Data Dictionary Publishing job. This job creates the necessary dmi_dd_attr_info, dmi_dd_type_info and dmi_dd_common_info objects. You can configure and run it from Documentum Administrator.

 

What makes up the Data Dictionary

I mentioned three object types that are created when the Data Dictionary Publishing job publishes the data dictionary information. Let's look at them briefly.

dmi_dd_common_info
This object type holds the information about an object type or attribute that is common to both. All objects of this type have an r_object_id starting with 68.

dmi_dd_type_info
A subtype of dmi_dd_common_info. This object type holds information about an object type that has been published to the data dictionary. All objects of this type have an r_object_id starting with 69.

dmi_dd_attr_info
A subtype of dmi_dd_common_info. This object type holds information about a property that has been published to the data dictionary. All objects of this type have an r_object_id starting with 6a. The sketch below shows one way to query it.
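As a quick sketch, the published label of an attribute can be read back with a simple DQL query; the property names used here (type_name, attr_name, nls_key, label_text) are my reading of the published data dictionary types, so verify them against your repository. An open IDfSession (session) and the imports from the earlier sketch are assumed.

// Reads the published label of dm_document.object_name for the 'en' locale.
IDfQuery query = new DfQuery();
query.setDQL("select label_text from dmi_dd_attr_info"
        + " where type_name = 'dm_document' and attr_name = 'object_name' and nls_key = 'en'");
IDfCollection c = query.execute(session, IDfQuery.DF_READ_QUERY);
try {
    while (c.next()) {
        System.out.println(c.getString("label_text"));
    }
} finally {
    c.close();
}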

You will have noticed that all these object types start with the prefix dmi, which means we cannot create or modify their objects directly; only the Data Dictionary Publishing job can create or modify them.

Download this Study Note (PDF)

Posted in Content Server, Documentum, General

Aliases and Alias sets in Documentum

Posted by Ajith Prabhakar on August 22, 2008


 

In simple words, aliases are placeholders that can hold any of the following:

1) A user or a group name

2) Folder information

An alias is a key-value pair, where the key is the alias name and the value is the actual value.

 

Alias sets are collections of aliases (alias names and their values).

 

Typical uses of Aliases

1) Dynamically resolve task performers in a workflow

2) Set the ACL, ACL domain and owner name on a sysobject or its subtypes

3) Dynamically link or unlink a sysobject (or subtype) to or from a folder path

4) Template ACLs (I will explain more about template ACLs in another study note soon)

 

Internals of an Alias and Alias sets

Documentum uses an object type named dm_alias_set to store aliases and their values. All objects of this type have an r_object_id starting with 66.

Let's see the attributes of this object type:

 

 

Name | Info | Description
alias_category | Integer (Repeating) | The category of the alias value at the corresponding index. Possible values: 0 = Unknown, 1 = User, 2 = Group, 3 = User or Group, 4 = Path of Cabinet, 5 = Path of Folder, 6 = Name of ACL.
alias_description | String (255) (Repeating) | Optional description of the alias value at the corresponding index in alias_names.
alias_names | String (32) (Repeating) | The name of the alias at the corresponding index in alias_value. (This string cannot contain a dot.)
alias_usr_category | Integer (Repeating) | Placeholder for user-defined categories for alias values.
alias_value | String (255) (Repeating) | The value of the alias at the corresponding index in alias_names.
object_name | String (32) (Single) | Name of the alias set; it must be unique among the alias sets in the repository.
object_description | String (128) (Single) | Description of the alias set.
owner_name | String (32) (Single) | The name of the user who owns this alias set.

 

Now that you have seen what makes up a dm_alias_set object, let's clarify how an alias is defined within an alias set.

alias_names stores the name of the alias and alias_value stores its corresponding value; both are repeating attributes. One index position therefore defines an alias category, alias name and alias value together.

 

For example, for an alias manager = John Smith, the index of manager in alias_names and the index of John Smith in alias_value will be the same. The sketch below shows how to read these pairs.
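Here is a small sketch of reading those parallel repeating attributes through DFC, assuming an open IDfSession; getObjectByQualification, getValueCount and getRepeatingString are standard DFC calls:

import com.documentum.fc.client.IDfPersistentObject;
import com.documentum.fc.client.IDfSession;
import com.documentum.fc.common.DfException;

// Prints every alias_name = alias_value pair of one alias set.
public static void dumpAliasSet(IDfSession session, String aliasSetName) throws DfException {
    IDfPersistentObject aliasSet = session.getObjectByQualification(
            "dm_alias_set where object_name = '" + aliasSetName + "'");
    int count = aliasSet.getValueCount("alias_names");
    for (int i = 0; i < count; i++) {
        // the same index position ties the name to its value (and category)
        System.out.println(aliasSet.getRepeatingString("alias_names", i)
                + " = " + aliasSet.getRepeatingString("alias_value", i));
    }
}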

 

Alias Reference and Scope of Alias

 

An alias is referenced using %alias_name.

A reference can also include the object_name of the alias set, in which case the reference is %alias_set_object_name.alias_name.

If the alias is referenced with the alias set name (%alias_set_object_name.alias_name), the server finds the alias set by name, picks up the alias name and resolves its value; this is a pretty straightforward job for the server.

 

But when the alias name is referenced without an alias set name, the server searches for the alias in specific scopes. The order and location of the search depend on where the alias is referenced.

 

The following table describes a few important scopes, the underlying object type, and the attribute that helps the server identify the alias set used to resolve alias names.

 

Scope | Object type | Attribute | Notes
User | dm_user | alias_set_id | The server uses the alias_set_id property of the user who performed the action that triggered alias resolution.
Group | dm_group | alias_set_id | The default group of the user who performed the action.
Lifecycle | dm_sysobject | r_alias_set_id | r_alias_set_id is set by the server when the object is attached to a lifecycle.
Server configuration | dm_server_config | alias_set_id | This alias set is used as the default system-level alias set.

 

Posted in Documentum, General

Documentum Object types Naming Convention

Posted by Ajith Prabhakar on July 1, 2008


All out-of-the-box Documentum object types follow this naming convention:

  • All object types that are commonly used and visible to users start with the letters dm (for example dm_document, dm_sysobject, dm_user).
  • All object types that are dynamically cached (meaning changes to them are visible to applications and users) start with the letters dmc (for example dmc_completed_workitem, dmc_jar, dmc_java_library).
  • All object types that are read-only start with the letters dmr. There are only two such types: dmr_content and dmr_containment.
  • All object types used internally by the Content Server start with the letters dmi (for example dmi_package, dmi_queue_item, dmi_session).

Posted in Content Server, General

Immutable Objects in Documentum

Posted by Ajith Prabhakar on July 1, 2008


Immutable objects are objects that cannot be changed: most of their properties, and their content, cannot be edited. I discussed a few points about immutability in my notes about virtual documents; let's look at some other aspects here.

Now let's see how an object becomes immutable and what the exceptions are.

  1. Versioning an object (a sysobject or any subtype of sysobject): when an object is versioned (a new version is created by check-in), the old version becomes immutable.
  2. Branching an object: when you branch an object, the parent of the new branched object becomes immutable. In both of the cases above, immutability is controlled by a Boolean attribute called r_immutable_flag, part of dm_sysobject; the Content Server sets r_immutable_flag to true on the old version of the object.
  3. Retention policy: if the object is governed by a retention policy that makes it immutable, r_immutable_flag is likewise set to true.
  4. Freezing a document by calling freeze(): the freeze method explicitly marks an object as immutable, setting r_immutable_flag to true and r_frozen_flag to true. r_frozen_flag is set to true only when freeze is called on the object, not by versioning or branching. Unfreezing a document: the unfreeze method sets r_frozen_flag to false and r_immutable_flag to false, making the object mutable again.

Freezing and Unfreezing a Virtual Document
If you choose to freeze an associated snapshot of a virtual document, the r_has_frzn_assembly attribute is also set to TRUE on that virtual document object.

When you freeze a snapshot, the r_immutable_flag attribute of each component in the snapshot is set to true, and each component's r_frzn_assembly_cnt is incremented. The r_frzn_assembly_cnt attribute counts the number of frozen snapshots that contain the component.

If the value of r_frzn_assembly_cnt is greater than zero, you cannot delete or modify the object.

Calling the freeze method on a snapshot automatically freezes the document with which the snapshot is associated. To freeze only the snapshot and not the document, first execute a freeze method with the argument to freeze the snapshot, then execute an unfreeze method to unfreeze only the document. A minimal sketch of freezing and unfreezing follows.
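A minimal sketch through DFC, assuming an open session; the object id is hypothetical, and the boolean argument to freeze()/unfreeze() controls whether an associated snapshot (assembly) is frozen or unfrozen as well:

import com.documentum.fc.client.IDfSession;
import com.documentum.fc.client.IDfSysObject;
import com.documentum.fc.common.DfException;
import com.documentum.fc.common.DfId;

// Freezes a document, checks its flag, then makes it mutable again.
public static void freezeExample(IDfSession session) throws DfException {
    IDfSysObject doc = (IDfSysObject) session.getObject(new DfId("0900000180001234")); // hypothetical id
    doc.freeze(false);                  // false: freeze the document only, not an attached snapshot
    System.out.println(doc.isFrozen()); // reflects r_frozen_flag
    doc.unfreeze(false);                // clears r_frozen_flag and r_immutable_flag
}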

Exceptions

Even when an object is immutable, some of its attributes can still be changed. Let's see which attributes those are.

These are the attributes that the Content Server can change on an immutable object:

  • a_archive 
  • i_isdeleted 
  • i_reference_cnt 
  • i_vstamp
  • r_access_date 
  • r_alias_set_id 
  • r_current_state
  • r_resume_state
  • r_frozen_flag
  • r_frzn_assembly_cnt
  • r_immutable_flag
  • r_policy_id

There are also a few attributes that an application or DQL can change on a frozen object:

  • i_folder_id
  • a_special_app
  • a_compound_architecture
  • a_full_text
  • a_storage_type
  • Version label (r_version_label) Symbolic labels only
  • acl_domain
  • acl_name
  • owner_name
  • group_name
  • owner_permit
  • group_permit
  • world_permit

Changing i_folder_id means you can link or unlink a frozen document to or from any folder or cabinet.

 Download this Study Note (PDF)

Posted in Content Server, Documentum, General

Difference between Super User and Sysadmin

Posted by Ajith Prabhakar on June 23, 2008


Many budding Documentum developers get confused about the differences between a Superuser and a Sysadmin. Though I mentioned these points in my Documentum security notes PDF, I feel this deserves a separate entry, so here are a few important privileges of each. The list is not exhaustive, but I think it covers most of it; if you feel I missed something important, please add it as a comment.

Sysadmin

  • Create, alter, and drop users and groups
  • Create, modify, and delete system-level ACLs
  • Grant and revoke Create Type, Create Cabinet, and Create Group privileges
  • Create types, cabinets, and printers
  • Manipulate workflows or work items, regardless of ownership
  • Manage any object’s lifecycle
  • Set the a_full_text attribute

 The Sysadmin privilege does not override object-level permissions

 Super User

  • Perform all the functions of a user with Sysadmin privileges
  • Unlock objects in the repository
  • Modify or drop another user’s user-defined object type
  • Create subtypes that have no supertype
  • Register and unregister another user’s tables
  • Select from any underlying RDBMS table regardless of whether it is registered or not
  • Modify or remove another user’s groups or private ACLs
  • Create, modify, or remove system ACLs
  • Grant and revoke Superuser and Sysadmin privileges
  • Grant and revoke extended privileges
  • View audit trail entries

Download this Study Note (PDF)

Posted in Admin, Documentum, General

Federation in Documentum

Posted by Ajith Prabhakar on June 17, 2008


Federation is one of the most common distributed Documentum models: multiple Documentum repositories run as a federation, with one governing repository and multiple member repositories. Let's find out more about federations.

 

Take this typical scenario: a major pharmaceutical company, ABC Corporation, has multiple research centers and production plants across the globe, with multiple Documentum repositories storing various information. A user logged into a corporate application needs to fetch documents from these repositories in a single session. For this architecture to work, each repository must have the same set of users, groups and ACLs, and managing that manually is troublesome and error-prone.

 

Now let's see how a federation makes this less complex.

As mentioned above, a federation consists of a governing repository and member repositories; all changes made to global users, global groups and external ACLs in the governing repository are automatically reproduced in the member repositories.

 

Requirements for Federation

  • Object type definitions must be the same in all participating repositories.
  • User and group definitions must be the same in all participating repositories.
  • The server hosting the governing repository must project to the connection brokers of the servers hosting the member repositories.
  • The servers hosting the member repositories must project to the connection brokers of the server hosting the governing repository.
  • If any of the participating Content Servers run with trusted server licenses, either the servers must be configured to listen on both secure and native ports, or the secure connection default for clients must allow clients to request a connection on a native or secure port.

 

A few bullet points about federations

  • Alterations to an object type are not automatically pushed to the participating repositories.
  • Only users and groups marked as global when they are created are pushed to and synchronized with the participating repositories.
  • Users belonging to object types extended from dm_user are not pushed automatically; this happens only if you specify the type in the federation configuration.
  • Each repository can be part of only a single federation.
  • A federation may contain different Content Server versions.
  • A federation may contain a mix of trusted and non-trusted Content Servers.

 

 Download this Study Note (PDF)

Posted in Content Server, Documentum, General

 