Introducing Java Code Generator 1.0: A Utility to Generate Java Beans from Documentum Object Types


Java Code Generator generates Java classes from Documentum object types. A few bullet points about what this utility does (a sample of the generated output follows the list):

  • Generates Java classes from Documentum object types
  • All non-inherited attributes become member variables of the generated Java class
  • Repeated attributes are generated as arrays of the corresponding type
  • By default, the class name is the capitalized name of the underlying Documentum object type
  • Option to prefix and suffix the class name
  • Option to specify the package name
  • Supports DFS annotations
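
To make the bullets above concrete, here is a hedged sketch of what the generated bean might look like for a hypothetical custom type my_invoice_type with a single-value string attribute invoice_no and a repeating string attribute approver_names (the type and attribute names are invented for illustration; the actual output of the tool may differ in its details):

 // Hypothetical generator output for the made-up type my_invoice_type
 public class My_invoice_type {
   // single-value attribute becomes a plain member variable
   protected String invoice_no;
   // repeating attribute becomes an array of the corresponding type
   protected String[] approver_names;

   public String getInvoice_no() {
     return invoice_no;
   }
   public void setInvoice_no(String invoice_no) {
     this.invoice_no = invoice_no;
   }
   public String[] getApprover_names() {
     return approver_names;
   }
   public void setApprover_names(String[] approver_names) {
     this.approver_names = approver_names;
   }
 }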

After a couple of beta versions, I am finally glad to announce the Java Code Generator. Thanks a lot to everyone who tried it and sent me valuable feedback. I have tried to incorporate most of the suggestions and fixed many of the bugs in this version.

 I have added a new DFC version of this tool to the download page.

Click here to go to the Downloads page

Using Java reflection to reduce Code and Development time in DFS


 

The Java Reflection API is one of the most powerful features of the Java language, and it can be used to reduce code significantly.

Most current enterprise applications consist of different layers and use value objects to transfer data from one layer to another. Explicitly calling the getters and setters for every attribute of these value objects inflates both the code and the development time of an application. Effective use of reflection can reduce both significantly.

So let's take a scenario: I have an object type MyObjectType extending dm_document with 50 additional attributes. As of Documentum 6.5, dm_document has 86 attributes, so adding 50 more gives 136 attributes for this object type. Consider a standard web application using DFS behind it that needs to manipulate (add or edit) instances of this object type. The service needs to add all these attributes to the PropertySet of the DataObject representing that instance, and then call the appropriate service.

 

Assuming that the bean instance of MyObjectType is named myObjectBean, the standard code will be something like this:

  ObjectIdentity objIdentity = new ObjectIdentity("myRepository");
  DataObject dataObject = new DataObject(objIdentity, "dm_document");
  PropertySet properties = dataObject.getProperties();
  properties.set("object_name", myObjectBean.getObject_Name());
  properties.set("title", myObjectBean.getTitle()); 
  // omitted for simplicity


  objectService.create(new DataPackage(dataObject), operationOptions);

 

In the above code you have to explicitly set each individual attribute on the object; the more attributes there are, the more complex and messy the code becomes.

Take another example, where you have to retrieve an object's information and pass it over to the UI layer.

 myObjectBean.setObject_name(properties.get("object_name").getValueAsString());
 myObjectBean.setTitle(properties.get("title").getValueAsString());
 myObjectBean.setMy_Custom_Property(properties.get("my_custom_property").getValueAsString());

This operation becomes even more complex if you decide to match the data types of your bean with those of the object type.

 

So what is the best approach to reducing this complexity? The answer is effective use of the Reflection API.

Let's take a step-by-step approach to handling this issue.

To understand this better, consider the following as the attributes of mycustomobjecttype:

 

Attribute Name     Attribute Type
first_name         string
last_name          string
age                integer
date_purchased     time
amount_due         double
local_buyer        boolean

 

Java Bean

Create a Java Bean that matches the Object Type

 import java.util.Date;

 public class Mycustomobjecttype {
  protected String first_name;
  protected String last_name;
  protected int age;
  protected Date date_purchased;
  protected double amount_due;
  protected boolean local_buyer;
  public int getAge() {
    return age;
  }
  public void setAge(int age) {
    this.age = age;
  }
  public double getAmount_due() {
    return amount_due;
  }
  public void setAmount_due(double amount_due) {
    this.amount_due = amount_due;
  }
  public Date getDate_purchased() {
    return date_purchased;
  }
  public void setDate_purchased(Date date_purchased) {
    this.date_purchased = date_purchased;
  }
  public String getFirst_name() {
    return first_name;
  }
  public void setFirst_name(String first_name) {
    this.first_name = first_name;
  }
  public String getLast_name() {
    return last_name;
  }
  public void setLast_name(String last_name) {
    this.last_name = last_name;
  }
  public boolean isLocal_buyer() {
    return local_buyer;
  }
  public void setLocal_buyer(boolean local_buyer) {
    this.local_buyer = local_buyer;
  }
}

Getting the Values from PropertySet (Loading Java Bean)

……

List<DataObject> dataObjectList = dataPackage.getDataObjects();
DataObject dObject = dataObjectList.get(0);
Mycustomobjecttype myCustomObject = new Mycustomobjecttype();
populateBeanFromPropertySet(dObject.getProperties(),myCustomObject);

……

// See the Reflection in Action here
public void populateBeanFromPropertySet(PropertySet propertySet, Object bean)
    throws Exception {
  BeanInfo beaninformation = Introspector.getBeanInfo(bean.getClass());
  PropertyDescriptor[] sourceDescriptors = beaninformation.getPropertyDescriptors();
  for (PropertyDescriptor descriptor : sourceDescriptors) {
    Object result = null;
    String name = descriptor.getName();
    // skip the implicit "class" property that every bean exposes
    if (!name.equals("class")) {
      if (propertySet.get(name) != null) {
        String propertyType = descriptor.getPropertyType().getName();
        if (propertyType.equals("int")) {
          result = Integer.valueOf(propertySet.get(name).getValueAsString());
        } else if (propertyType.equals("double")) {
          result = Double.valueOf(propertySet.get(name).getValueAsString());
        } else if (propertyType.equals("boolean")) {
          result = Boolean.valueOf(propertySet.get(name).getValueAsString());
        } else if (propertyType.equals("java.util.Date")) {
          DateProperty dat = (DateProperty) propertySet.get(name);
          result = dat.getValue();
        } else {
          // none of the other expected types, so treat it as a String
          result = propertySet.get(name).getValueAsString();
        }
        if (result != null) {
          // call the matching setter on the bean
          descriptor.getWriteMethod().invoke(bean, result);
        }
      }
    }
  }
}

Setting Values to Property Set

 

public DataPackage createContentLessObject(Mycustomobjecttype myCustomType) throws Exception {
  ObjectIdentity objectIdentity = new ObjectIdentity("testRepositoryName");
  // the bean class name mirrors the repository type name, so derive the type from
  // the simple class name (getClass().getName() would return the package-qualified name)
  DataObject dataObject = new DataObject(objectIdentity,
      myCustomType.getClass().getSimpleName().toLowerCase());
  PropertySet properties = populateProperties(myCustomType);
  properties.set("object_name", myCustomType.getFirst_name() + myCustomType.getLast_name());
  dataObject.setProperties(properties);
  DataPackage dataPackage = new DataPackage(dataObject);
  OperationOptions operationOptions = new OperationOptions();
  return objectService.create(dataPackage, operationOptions);
}

 

// Reflection in Action
public PropertySet populateProperties(Object bean) throws Exception {
  PropertySet myPropertyset = new PropertySet();
  BeanInfo beaninfo = Introspector.getBeanInfo(bean.getClass());
  PropertyDescriptor[] sourceDescriptors = beaninfo.getPropertyDescriptors();
  for (PropertyDescriptor descriptor : sourceDescriptors) {
    String propertyName = descriptor.getName();
    // skip the implicit "class" property that every bean exposes
    if (!propertyName.equals("class")) {
      // don't set read-only attributes, if any (for example r_object_id)
      if (!propertyName.startsWith("r_")) {
        Object value = descriptor.getReadMethod().invoke(bean);
        if (value != null) {
          myPropertyset.set(propertyName, value);
        }
      }
    }
  }
  return myPropertyset;
}

Chaining of Custom Services in DFS


 

There is an interesting drawback in Documentum Foundation Services version 6.5.

Issue:

When you chain custom services and try to build them, the build fails. Let's see a scenario based on the DFS sample code itself:

@DfsPojoService(targetNamespace = "http://common.samples.services.emc.com", requiresAuthentication = true)
public class HelloWorldService {

  public String sayHello(String name) {
    ServiceFactory serviceFactory = ServiceFactory.getInstance();
    IServiceContext context = ContextFactory.getInstance().getContext();
    try {
      IAcmeCustomService secondService =
          serviceFactory.getService(IAcmeCustomService.class, context);
      secondService.testExceptionHandling();
    } catch (ServiceInvocationException e) {
      e.printStackTrace();
    } catch (CustomException e) {
      e.printStackTrace();
    } catch (ServiceException e) {
      e.printStackTrace();
    }
    return "Hello " + name;
  }
}

Here, in the DFS sample code, I am chaining the services. Everything looks fine, but when you build this service, the generateArtifacts Ant task fails with a ClassNotFound compiler error at

IAcmeCustomService secondService = serviceFactory.getService(IAcmeCustomService.class, context);

What happens here is that when the build does its initial clean-up, all the generated client interfaces are deleted, and DFS currently does not check for any dependencies.

Let me take the example of dfs-build.xml, which is part of the CoreDocumentumProject in Composer:

<generateArtifacts serviceModel="${gen.src.dir}/${context.root}-${module.name}-service-model.xml" destdir="${gen.src.dir}/">
  <src location="${src.dir}" />
  <classpath>
    <path refid="projectclasspath.path" />
  </classpath>
</generateArtifacts>
</target>

 

In this, we cannot set any exclusion path in <src location="${src.dir}" />, simply because even if you provide a <fileset/> or <dirset/> with a pattern set, it is not recognized.

I raised a support case with EMC and they told me that this is not currently supported! They said they will add it as a feature request.

This means we cannot chain custom services unless EMC fixes this or we use a semi-manual workaround to overcome the issue.

The Work-around that I found

Follow these steps to overcome this issue

Step 1

Identify the services that will call the custom services and create a new source directory for them in Composer. Here I am calling it depended_src; move the services that call the custom services there. The depended_src folder should be in a separate path from the web services src folder.

[Screenshot src-img1: source folder layout in Composer]

Step 2

1) Now edit the build file and add these two properties:

 

<property name="my.core.services.classes" value="${service.projectdir}/Web Services/bin/classes" />

<property name="dep.src.dir" value="${service.projectdir}/depended_src" />

The dep.src.dir property should point to the depended_src location mentioned in Step 1.

2) Create an additional target for generateModel and generateArtifacts:

<target name="generateDependencies" depends="generate">
  <echo message="Calling generateDependencies" />
  <generateModel contextRoot="${context.root}" moduleName="${module.name}" destdir="${gen.src.dir}/">
    <services>
      <fileset dir="${dep.src.dir}">
        <include name="**/*.java" />
      </fileset>
    </services>
    <classpath>
      <pathelement location="${my.core.services.classes}" />
      <path refid="projectclasspath.path" />
    </classpath>
  </generateModel>
  <generateArtifacts serviceModel="${gen.src.dir}/${context.root}-${module.name}-service-model.xml" destdir="${gen.src.dir}/">
    <src location="${dep.src.dir}" />
    <classpath>
      <pathelement location="${my.core.services.classes}" />
      <path refid="projectclasspath.path" />
    </classpath>
  </generateArtifacts>
  <!-- signal build is done -->
  <!-- used by DFSBuilder.java -->
  <copy todir="${src.dir}/../" file="${basedir}/dfs-builddone.flag" />
</target>

3) Now edit dfs-build.properties and add the following property:

service.projectdir= <absolute path to the project>

Step 3

1) Run the generate task.

2) Copy all the service entries (between <module> and </module>) from <context-root>-<module-name>-service-model.xml; you can find this file in the <project_dir>\Web Services\bin\gen-src folder.

3) Now run the generateDependencies task that was created in Step 2.

4) Now edit <context-root>-<module-name>-service-model.xml and add the copied service entries to this file.

5) If you want to create the jar files, you can call the package task after this.

This should help you to chain custom services. If you have found any alternative ways, please comment.

 

Aspects, the new BOF type in Documentum


Aspects

Aspects, the new member of the BOF family, are one of the new additions in Documentum 6 (D6). These short notes are just meant to give you an insight into the fundamentals of aspects.

What are Aspects?

In simple words, aspects are like TBOs, but they are not associated with any object type. That means you can attach or detach an aspect at runtime to any object type. In other words, aspects are "a mechanism for adding behavior and/or attributes to a Documentum object instance without changing its type definition" (from the DFC Study Guide).

Aspects, like the other BOF module types, are stored in the repository. When a DFC runtime requests an aspect, it is downloaded to that system and used. Aspects belong to the type dmc_aspect_type.

Aspects are saved under /System/Modules/Aspect in the repository that acts as the global registry. If any changes are made to an aspect, the DFC runtime detects that and downloads the latest version to its local BOF cache.

What can Aspects do?

Aspects are different from the other BOF module types because they are attached to object instances, not to object types. They are not a per-application customization. Any persistent object instance can have multiple aspects attached to it, as long as each aspect has a unique name.

Aspects can define additional custom attributes and set custom values on those attributes for any object whose type derives from the persistent object type. There are no restrictions on the type or number of attributes that you can create on an aspect; they can be of any supported data type and can be either single-value or repeating attributes.

The attributes defined in one aspect can be accessed from another aspect. The fully qualified form of an aspect attribute is aspect_name.attribute_name. When you fetch an object, the values of the attributes of all attached aspects are also fetched. If you destroy an object, the attribute values of its attached aspects are deleted along with it.

Where Can I use it?

Aspects are used when you have cross-type functionality to define. When you create an application infrastructure or a corporate platform, aspects are the best way to implement business functionality that is common across different object types. Aspects are also the solution when you have functionality that applies on a per-instance basis.

Let's see a real-world scenario where you can use aspects.

Let's imagine a system with a user base spread across different countries, where the behavior and fields of the application change depending on the country a user belongs to. You can create a different aspect for each country, containing the country-specific behavior and fields, and attach it as needed. If a user moves from one country to another, simply detach the old country aspect and attach the new country aspect, as sketched below.
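
As a rough illustration of that country scenario, here is a hedged DFC sketch. The aspect names and the attribute are invented, and the IDfAspects interface with its attachAspect/detachAspect calls is written from memory of the DFC 6 aspect support, so please verify the exact package and method signatures against your DFC javadocs before using this.

 // Switch the country-specific aspect on an object when a user relocates.
 // "uk_buyer_aspect" and "us_buyer_aspect" are hypothetical aspect names.
 public void switchCountryAspect(IDfSysObject object) throws DfException {
   // DFC exposes aspect operations through the IDfAspects interface (assumption: verify in your DFC version)
   IDfAspects aspects = (IDfAspects) object;
   aspects.detachAspect("uk_buyer_aspect", null); // remove the old country aspect
   aspects.attachAspect("us_buyer_aspect", null); // attach the new one at runtime
   // aspect attributes are addressed by their fully qualified name: aspect_name.attribute_name
   object.setString("us_buyer_aspect.local_currency", "USD");
   object.save();
 }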

 To be Continued ….

DFS Data Model Explained


Hi All 

As I explore the new DFS, I am noting down some points that I can share with you. I know I cannot explain everything about DFS here; like you, I am also still learning, but I will try to give an overall picture of what DFS is all about. Please visit this blog often as I keep updating this.

 

DFS Explained

Documentum Foundation Services (DFS) is the SOA face of Documentum. Remote invocation of DFS is implemented using SOAP-based web services. The beauty of DFS is that although it is exposed for remote usage as web services, it can also be called locally using the DFS client libraries. DFS also eases migration by allowing services to be developed on top of the existing BOF (Documentum Business Object Framework). Another interesting point is that DFS consolidates numerous interdependent methods into a single service. The DFS data model is expressed both in Java client library classes and in the service XML schemas, which provides a consistent approach to modeling the data exchanged between the various business processes. As mentioned above, DFS clients can be of two types:

  • Applications (consumers) written in any language that can use WSDL to interact with DFS
  • Clients written in Java that use the DFS Java client library

Data Model of DFS: the data passed to and from the services is encapsulated in the DFS object model. These are a few of the important object types in DFS:

  • DataPackage
  • DataObject
  • ObjectIdentity
  • Property
  • Content
  • Permissions
  • Relationship

Let's see what each of these does now.

DataPackage 

According to the DFS Reference Guide from EMC, "The DataPackage class defines the fundamental unit of information that contains data passed to and returned by services operating in the DFS framework." That means a DataPackage is a collection of DataObject instances, which is passed back and forth by Object Service operations. In other words, when you call services like create, get, or update, the DataObject instances are passed using the DataPackage class.

It's like an envelope for holding various DataObject instances. Object Service operations process all the DataObject instances in a DataPackage sequentially, as in the small example below.
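
For example, here is a minimal sketch of packaging two DataObject instances into a single create call (dataObject1 and dataObject2 are assumed to be already built DataObjects, and objectService an already obtained Object Service instance):

 // A DataPackage is just an envelope around one or more DataObject instances
 DataPackage dataPackage = new DataPackage();
 dataPackage.addDataObject(dataObject1);
 dataPackage.addDataObject(dataObject2);
 // The Object Service processes the contained DataObjects sequentially
 DataPackage result = objectService.create(dataPackage, new OperationOptions());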

 

DataObject

A DataObject is the DFS representation of a persistent object in a Documentum repository. A DataObject potentially carries all the information related to an object in the repository: its content, its properties (both single-value and repeating), its relationships with other objects, its permissions, and so on. Due to the nature of these objects, DataObjects can be quite complex.
Optionally, these instances may carry instructions to services about how the different parts of the object have to be processed.
The type field of the DataObject class holds the name of the underlying object type that corresponds to the instance, for example dm_user or dm_document. The default type is dm_document, i.e. if no type is specified, the service implicitly assigns the type dm_document.

 

[Figure: a DataObject is composed of an ObjectIdentity, a PropertySet, Content, Permissions, and Relationships.]

ObjectIdentity

As the name says, this represents a unique object in the repository. An instance of this class must have the repository name and a unique identifier for that object. The identifier that represents a repository object may be any of the following:

  • OBJECT_ID – Contains the r_object_id of the object (represents both current and non-current objects)
  • OBJECT_PATH – Contains a string expression of the path of the object in the repository:
    [Cabinet]/[Folder]/[File_Name]
    (This represents only the current object. Since you can create multiple objects with the same name in the same folder, it does not guarantee uniqueness of the object.)
  • QUALIFICATION – A DQL snippet that qualifies an object uniquely
    (The DQL snippet is everything in the query that follows the keyword FROM, e.g. dm_document where r_object_id='09xxxxx'.)

Note: During the creation of a new object, all you need to populate is the repository name, as illustrated below.
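
To illustrate, here is a hedged sketch of the different ways an ObjectIdentity can be built (the repository name, id, path and qualification values are placeholders; the constructors follow the pattern used in the DFS samples, so verify them against your version):

 // By object id: identifies current as well as non-current versions
 ObjectIdentity byId = new ObjectIdentity(new ObjectId("0900000180001234"), "myRepository");

 // By path: identifies only the current object at that location
 ObjectIdentity byPath = new ObjectIdentity(new ObjectPath("/MyCabinet/MyFolder/MyDocument.doc"), "myRepository");

 // By qualification: a DQL fragment (everything after FROM) that uniquely qualifies one object
 ObjectIdentity byQualification =
     new ObjectIdentity(new Qualification("dm_document where object_name = 'MyDocument.doc'"), "myRepository");

 // For a brand new object, only the repository name is needed
 ObjectIdentity forNewObject = new ObjectIdentity("myRepository");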

Property

Each Property represents an object attribute (property) in the Documentum repository. A PropertySet is a collection that works as a container for multiple Property objects.

As there are two kinds of attributes on an object, a property can be either a single property (single-value attribute) or an array property (repeating attribute). The Property class has been subclassed to accommodate the various data types, as follows (a short usage sketch follows the listing):

StringProperty
NumberProperty
BooleanProperty
DateProperty
ObjectIdProperty
ArrayProperty (abstract class)
  StringArrayProperty
  NumberArrayProperty
  BooleanArrayProperty
  DateArrayProperty
  ObjectIdArrayProperty
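
Putting a few of these classes to use, here is a minimal sketch of building and reading a PropertySet with the set(name, value) style already used in the examples above (the attribute names are taken from the sample table earlier and are illustrative only; which Property subclass ends up wrapping each value is left to DFS):

 PropertySet properties = new PropertySet();
 properties.set("object_name", "my_contract");  // string attribute
 properties.set("local_buyer", true);           // boolean attribute
 properties.set("age", 42);                     // integer attribute

 // Reading a value back, as in the bean-loading example earlier
 String objectName = properties.get("object_name").getValueAsString();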
Transient Property

These properties are not part of the persistent properties of a repository object. A transient property can be used to send custom data fields to a service, for a purpose other than setting attributes on repository objects.

 

Array Property and Value Action

The repeating attributes of an object type are represented as array properties. As you can see in the listing above, each ArrayProperty is an array of the corresponding single property. Another interesting aspect of ArrayProperty is the ValueAction. So what is a ValueAction all about? It is an optional action-index pair: the pair carries the instruction about what action is to be performed on a particular element of the ArrayProperty, where the index is the position in the repeating attribute and the action is what should be done at that index (see the sketch after the list). The possible ValueActionType values are:

  • Append
    When processing ValueAction[p], the value at ArrayProperty[p] is appended to the end of the repeating properties list of the persistent repository object. The index of the ValueAction item is ignored.
  • Insert
    When processing ValueAction[p], the value at ArrayProperty[p] is inserted into the repeating attribute list before position index. Note that all items in the list to the right of the insertion point are offset by 1, which must be accounted for in subsequent processing.
  • Delete
    The item at position index of the repeating attribute is deleted. When processing ValueAction[p], the value at ArrayProperty[p] must be set to an empty value.
  • Set
    When processing ValueAction[p], the value at ArrayProperty[p] replaces the value in the repeating attribute list at position index. (From the DFS Development Guide)
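
Here is a hedged sketch of pairing ValueActions with an ArrayProperty, based on my reading of the DFS Development Guide; the ValueAction constructor and the setValueActions signature are written from memory, so verify them against the guide and the javadocs for your DFS version:

 // Two parallel positions: the value at index p is interpreted according to ValueAction[p]
 ArrayProperty keywords = new StringArrayProperty("keywords", new String[] { "sharks", "" });

 ValueAction append = new ValueAction(ValueActionType.APPEND, 0); // append "sharks"; the index is ignored
 ValueAction delete = new ValueAction(ValueActionType.DELETE, 2); // delete the repeating value at index 2; the paired value stays empty

 keywords.setValueActions(append, delete);
 // the ArrayProperty is then added to the DataObject's PropertySet as usual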

 

Content

An instance of the Content class or one of its subclasses represents the actual file content associated with a DataObject. A DataObject can have zero or more Content objects, and it can hold renditions as well. A Content object can be configured to hold any of the following (a small sketch follows the list):

  • The whole document
  • A page of a document
  • Pages (one or more) that are represented by a particular characteristic
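
For instance, here is a hedged sketch of attaching a local file as the content of a DataObject using the FileContent subclass (the path and format are placeholders; the constructor follows the pattern used in the DFS samples, so double-check it against your DFS version):

 // Attach a local file as the content of the DataObject;
 // "gif" is the Documentum format name, not the file extension
 Content content = new FileContent("c:/temp/MyImage.gif", "gif");
 dataObject.getContents().add(content);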

 

 

Permissions

A DataObject has a list of Permission objects, which describes the permissions on the repository object that the DataObject represents. The permission list provides read access to the permissions that the currently logged-in user has on that object.

An interesting point to note here is that you cannot change the permissions on a repository object by changing the permission list and saving the DataObject.

To change the permissions of a repository object, the actual ACL of that object has to be edited or replaced. As mentioned above, multiple Permission objects make up the permission list. A Permission object has a field named permissionType that indicates what kind of permission it is; the possible values are BASIC, EXTENDED or CUSTOM. Another important thing about permissions is that if you grant a particular permission to a user on an object, all the permissions below the granted level are also granted to that user. This is known as compound or hierarchical permissions. The PermissionType enum constants are listed below (from the DFS Guide).

Permission Type   Permission          Description
Basic             NONE                No access is permitted.
                  BROWSE              The user can view attribute values of content.
                  READ                The user can read content but not update it.
                  RELATE              The user can attach an annotation to the object.
                  VERSION             The user can version the object.
                  WRITE               The user can write and update the object.
                  DELETE              The user can delete the object.
Extended          X_CHANGE_LOCATION   The user can move an object from one folder to another. All users having at least Browse permission on an object are granted Change Location permission by default for that object.
                  X_CHANGE_OWNER      The user can change the owner of the object.
                  X_CHANGE_PERMIT     The user can change the basic permissions on the object.
                  X_CHANGE_STATE      The user can change the document lifecycle state of the object.
                  X_DELETE_OBJECT     The user can delete the object. The Delete Object extended permission is not equivalent to the basic Delete permission; Delete Object does not grant Browse, Read, Relate, Version, or Write permission.
                  X_EXECUTE_PROC      The user can run the external procedure associated with the object. All users having at least Browse permission on an object are granted Execute Procedure permission by default for that object.

 

 

Relations

Relationships allow you to create a single DataObject that specifies all the relations it has with other objects, both existing and new. This also allows a single service call to get, update or create a whole set of objects together with their relationships. Relationship has two subclasses, ObjectRelationship and ReferenceRelationship. These classes define all the relationships that any object in the repository can have. They represent existing relations and can also add new relations, which are then updated in the repository through service calls.

According to the DFS Developer Guide, the repository defines object relationships using different constructs, including generic relationship types represented by hardcoded strings (folder and virtual_document); dm_relation objects, which contain references to dm_relation_type objects; and dmc_relationship_def objects, a representation that provides more sophistication in Documentum 6. The DFS Relationship object provides an abstraction for dealing with these various metadata representations in a uniform manner.

 

ObjectRelationship

An ObjectRelationship represents a relationship to a new or existing repository object. It can be used by the update operation to either update or create the related objects: when passed to the update operation, it modifies or creates the target objects.

ReferenceRelationship

A ReferenceRelationship represents a relationship to an existing repository object and is specified using an ObjectIdentity. It can only be used to create a relationship between two objects; it will not create or update the target objects themselves, as in the sketch below.
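
As an illustration, here is a hedged sketch of using a ReferenceRelationship to link a new DataObject into an existing folder (the path and repository name are placeholders; the RELATIONSHIP_FOLDER and ROLE_PARENT constants and the setters follow the pattern used in the DFS samples, so verify them against your version):

 // Identify the existing folder purely by its identity; the folder itself is not modified
 ObjectIdentity folderIdentity = new ObjectIdentity(new ObjectPath("/MyCabinet/MyFolder"), "myRepository");

 ReferenceRelationship folderRelationship = new ReferenceRelationship();
 folderRelationship.setName(Relationship.RELATIONSHIP_FOLDER); // the generic "folder" relationship
 folderRelationship.setTarget(folderIdentity);                 // the existing repository object
 folderRelationship.setTargetRole(Relationship.ROLE_PARENT);   // the folder acts as the parent

 dataObject.getRelationships().add(folderRelationship);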

 

 

Download this study note (PDF)