Extending EMF ItemProviders using Google Guice – I

May 18, 2011 9 comments

ItemProviders are the central middlemen in the EMF framework, providing all of the interfaces needed for model objects to be viewed or edited. They adapt EMF objects by providing the following:

1. Content and label providers, which enable model objects to be viewed in Eclipse viewers
2. Property source for model objects to make them available in Eclipse Property View
3. Commands for editing model objects
4. Forwarding change notifications on to Eclipse viewers

EMF generates the ItemProviders for you, assuming some defaults. Although the defaults work most of the time, there are occasions where you would like to change them. You could make the changes directly in the generated code and add @generated NOT annotations. However, by doing this you invite trouble from MDSD gurus, because the principle says, “Don’t touch generated code!”. Ideally you would like to retain the generated ItemProviders as they are and extend them somehow with your changes.

This tutorial shows how to do this in an elegant way using Dependency Injection (DI). We will use the EMF “Extended Library Model” example to take us through this.

Setup

1. Create a new Eclipse workspace

2. Add the “EMF Extended Library Model Example” projects using “New Project Wizard”

3. Run the example projects and create a sample Library model

Extending ItemProviders the EMF way

The generated editor by default displays the title of a Book in the tree. This is because of the default getText() implementation in BookItemProvider.

  /**
   * This returns the label text for the adapted class.
   * <!-- begin-user-doc -->
   * <!-- end-user-doc -->
   * @generated
   */
  @Override
  public String getText(Object object)
  {
    String label = ((Book)object).getTitle();
    return label == null || label.length() == 0 ?
      getString("_UI_Book_type") : //$NON-NLS-1$
      getString("_UI_Book_type") + " " + label; //$NON-NLS-1$ //$NON-NLS-2$
  }

Let’s say we want to change this such that the number of pages in the book is also displayed along with its title. You could do this by changing the generated code and adding the @generated NOT annotation.

  /**
   * This returns the label text for the adapted class.
   * <!-- begin-user-doc -->
   * <!-- end-user-doc -->
   * @generated NOT
   */
  @Override
  public String getText(Object object)
  {
    String title = ((Book)object).getTitle();
    return title == null || title.length() == 0 ?
      getString("_UI_Book_type") : //$NON-NLS-1$
      getString("_UI_Book_type") + " " + title + " (" + ((Book)object).getPages() + ")"; //$NON-NLS-1$ //$NON-NLS-2$
  }

If you now run the editor, the number of pages is displayed together with the book name.

This however breaks the Generation Gap Pattern in code generation.

Extending ItemProviders by extension

The cleaner approach is to extend the generated ItemProvider to include your changes. We also have to make sure that the tooling uses our customized ItemProvider. This is rather easy.

EMF ItemProviders are in fact EMF adapters. They provide some behavioural extensions to the modeled objects. The recommended way of creating adapters in EMF is using AdapterFactories. The ItemProviders are also created in this way, using the generated ItemProviderAdapterFactory, EXTLibraryItemProviderAdapterFactory in this example. You could override createXXXAdapter() methods to create an instance of your custom ItemProvider and register your extended ItemProviderAdapterFactory.

Let’s first do this the traditional way (without DI).

1. Following the Generation Gap Pattern article by Heiko, you could change the extlibrary.genmodel to output the generated code in a src-gen folder of the edit plugin and put your extensions in the src folder. In this tutorial, to further isolate our changes, we create a new extension plugin, org.eclipse.example.library.edit.extension.

2. Create a new class BookItemProviderExtension which extends BookItemProvider within the new plugin.

public class BookItemProviderExtension extends BookItemProvider {

	public BookItemProviderExtension(AdapterFactory adapterFactory) {
		super(adapterFactory);
	}

	@Override
	public String getText(Object object) {
		return super.getText(object) + " (" + ((Book) object).getPages() + ") ";
	}

}

3. Create EXTLibraryItemProviderAdapterFactoryExtension, which extends EXTLibraryItemProviderAdapterFactory.

public class EXTLibraryItemProviderAdapterFactoryExtension extends
		EXTLibraryItemProviderAdapterFactory {

	@Override
	public Adapter createBookAdapter() {
		if (bookItemProvider == null) {
			bookItemProvider = new BookItemProviderExtension(this);
		}
		return bookItemProvider;
	}

}

4. Modify the editor code to use the new EXTLibraryItemProviderAdapterFactoryExtension.


public class EXTLibraryEditor extends MultiPageEditorPart
  implements
    IEditingDomainProvider,
    ISelectionProvider,
    IMenuListener,
    IViewerProvider,
    IGotoMarker
{
...
  protected void initializeEditingDomain()
  {
    // Create an adapter factory that yields item providers.
    //
    adapterFactory = new ComposedAdapterFactory(ComposedAdapterFactory.Descriptor.Registry.INSTANCE);
    adapterFactory.addAdapterFactory(new ResourceItemProviderAdapterFactory());
    adapterFactory.addAdapterFactory(new EXTLibraryItemProviderAdapterFactoryExtension());
    adapterFactory.addAdapterFactory(new ReflectiveItemProviderAdapterFactory());
    ...
  }
  ...
}

If you run the editor again, now including the new extension plugin, you get the same result as before. With this step, we have managed to isolate our changes in a new plugin.

Extending ItemProviders using Google Guice

Although we managed to isolate our changes without touching the generated code, this might not be enough in the long run. What if

1. you need multiple ItemProvider implementations for the same EMF Object and want to switch between them
2. you want to extend many ItemProviders in the inheritance hierarchy. For example, you need to change PersonItemProvider and WriterItemProvider (which extends PersonItemProvider).

Although you don’t need to use DI to solve these problems, DI would do it for you in a simpler, cleaner way. In this tutorial we will use Google Guice to achieve this. Google Guice is a cool, lightweight dependency injection framework. You could inject your dependencies by writing just a few lines of code and some annotations. If you don’t like annotations, you could even use Guice without them. If you are not familiar with Google Guice, read Getting Started.
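To give a quick taste of Guice, here is a minimal, self-contained sketch (the Greeter types are made up for illustration and have nothing to do with the library model): a module binds an interface to an implementation, and an injector hands out fully wired instances.

import com.google.inject.AbstractModule;
import com.google.inject.Guice;
import com.google.inject.Inject;
import com.google.inject.Injector;

public class GuiceTaste {

	interface Greeter {
		String greet(String name);
	}

	static class SimpleGreeter implements Greeter {
		public String greet(String name) {
			return "Hello " + name;
		}
	}

	static class Greeting {
		@Inject
		private Greeter greeter; // field injection, as used later in this tutorial

		String welcome() {
			return greeter.greet("EMF");
		}
	}

	public static void main(String[] args) {
		Injector injector = Guice.createInjector(new AbstractModule() {
			@Override
			protected void configure() {
				// The only place that knows the concrete implementation.
				bind(Greeter.class).to(SimpleGreeter.class);
			}
		});
		// Guice instantiates Greeting and injects its Greeter field.
		System.out.println(injector.getInstance(Greeting.class).welcome());
	}
}

Running it prints “Hello EMF”; swapping SimpleGreeter for another Greeter only touches the binding.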

Let’s go ahead and “Guicify” our earlier example. We start with simple modifications and move on to more detailed ones in later steps.

1. First, you need to add a dependency on Google Guice to our org.eclipse.example.library.edit.extension plugin.

Google Guice is currently not publicly available as an Eclipse plugin. There is a Bugzilla request to add it to Orbit. It is, however, shipped with Xtext as an Eclipse plugin. Since I have Xtext in my target, I use this in my tutorial. If you don’t have it, you need to add Google Guice as an external JAR to your project.

2. The next step is to get rid of the “new” statements in the extended ItemProviderAdapterFactory, as these are what bind it to a specific ItemProvider implementation. We use Google Guice field injection to inject the BookItemProvider.


public class EXTLibraryItemProviderAdapterFactoryExtension extends
		EXTLibraryItemProviderAdapterFactory {

	@Inject
	protected BookItemProvider bookItemProvider;

	@Override
	public Adapter createBookAdapter() {
		return bookItemProvider;
	}

}

3. We now need to create a Google Guice Module to bind the extended ItemProvider. So go ahead and create a module as follows:


public class LibraryModule extends AbstractModule {
	@Override
	protected void configure() {
		bind(BookItemProvider.class).to(BookItemProviderExtension.class).in(Scopes.SINGLETON);
	}

}

4. You could also inject the extended ItemProviderAdapterFactory into our editor. However, since we don’t want the editor to have a Google Guice dependency, we instead make the following changes to the module and the extended ItemProvider.


public class LibraryModule extends AbstractModule {

	private final AdapterFactory adapterFactory;

	public LibraryModule(AdapterFactory adapterFactory) {
		this.adapterFactory = adapterFactory;
	}

	@Override
	protected void configure() {
		bind(AdapterFactory.class).toInstance(adapterFactory);
		bind(BookItemProvider.class).to(BookItemProviderExtension.class).in(Scopes.SINGLETON);
	}

}


public class BookItemProviderExtension extends BookItemProvider {

	@Inject
	public BookItemProviderExtension(AdapterFactory adapterFactory) {
		super(adapterFactory);
	}

	@Override
	public String getText(Object object) {
		return super.getText(object) + " (" + ((Book) object).getPages() + ") ";
	}

}

5. Now we need to create a Guice Injector and have it inject the factory’s own fields.


public class EXTLibraryItemProviderAdapterFactoryExtension extends
		EXTLibraryItemProviderAdapterFactory {

	@Inject
	protected BookItemProvider bookItemProvider;

	public EXTLibraryItemProviderAdapterFactoryExtension() {
		// The factory itself is created with "new", so ask the injector
		// to populate its @Inject fields.
		Guice.createInjector(new LibraryModule(this)).injectMembers(this);
	}

	@Override
	public Adapter createBookAdapter() {
		return bookItemProvider;
	}

}

6. Run it and you get the same result as before.

You could now inject a different implementation of the ItemProvider simply by changing the binding in the module.
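For instance, suppose you wrote a second, purely hypothetical extension with a different label style (VerboseBookItemProvider is an illustrative name, not part of the example sources):

public class VerboseBookItemProvider extends BookItemProvider {

	@Inject
	public VerboseBookItemProvider(AdapterFactory adapterFactory) {
		super(adapterFactory);
	}

	@Override
	public String getText(Object object) {
		// A hypothetical alternative label that spells the page count out in full.
		return super.getText(object) + " [" + ((Book) object).getPages() + " pages]";
	}

}

Switching the tooling to it is then a one-line change in LibraryModule: bind(BookItemProvider.class).to(VerboseBookItemProvider.class).in(Scopes.SINGLETON); – neither the editor nor the factory needs to be touched.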

That was a rather trivial example. Let’s take a more significant one, where we need to make changes to PersonItemProvider and WriterItemProvider (which extends PersonItemProvider).

The Extended Library Model example by default displays the LastName of the Writer as the Name attribute. This comes from the following lines of code in PersonItemProvider, the superclass of WriterItemProvider.


  @Override
  public String getText(Object object)
  {
    String label = ((Person)object).getLastName();
    return label == null || label.length() == 0 ?
      getString("_UI_Person_type") : //$NON-NLS-1$
      getString("_UI_Person_type") + " " + label; //$NON-NLS-1$ //$NON-NLS-2$
  }

Let’s change this to display the FirstName instead of the LastName.

1. Create a new extension ItemProvider for Person, PersonItemProviderExtension, and override the getText() method as follows.


public class PersonItemProviderExtension extends PersonItemProvider {

	@Inject
	public PersonItemProviderExtension(AdapterFactory adapterFactory) {
		super(adapterFactory);
	}

	@Override
	public String getText(Object object) {
		String label = ((Person) object).getFirstName();
		return label == null || label.length() == 0 ? getString("_UI_Person_type") : //$NON-NLS-1$
				getString("_UI_Person_type") + " " + label; //$NON-NLS-1$ //$NON-NLS-2$

	}

}

2. Inject the extended PersonItemProviderExtension into the ItemProviderAdapterFactory extension.


public class EXTLibraryItemProviderAdapterFactoryExtension extends
		EXTLibraryItemProviderAdapterFactory {
	...

	@Inject
	protected PersonItemProvider personItemProvider;

	@Override
	public Adapter createPersonAdapter() {
		return personItemProvider;
	}

}

3. Update the Google Guice Module


	@Override
	protected void configure() {
		bind(AdapterFactory.class).toInstance(adapterFactory);
		bind(BookItemProvider.class).to(BookItemProviderExtension.class).in(Scopes.SINGLETON);
		bind(PersonItemProvider.class).to(PersonItemProviderExtension.class).in(Scopes.SINGLETON);
	}

If you run the code now, you will see that we haven’t got the expected result yet. This is because WriterItemProvider still extends PersonItemProvider and not PersonItemProviderExtension, where we integrated the changes. We could go ahead and create a new WriterItemProviderExtension which extends PersonItemProviderExtension, but that would tie WriterItemProviderExtension to the PersonItemProviderExtension implementation. We would like to inject each of these extensions without creating any inter-dependency between them.

4. We can change inheritance to delegation and use injection again here, that is, inject PersonItemProviderExtension into WriterItemProviderExtension and delegate the getText() call.

Changing inheritance to delegation however comes at the cost of some Java specific issues which I will talk about in a later part of my article.


public class WriterItemProviderExtension extends WriterItemProvider {

	@Inject
	private PersonItemProvider personItemProvider;

	@Inject
	public WriterItemProviderExtension(AdapterFactory adapterFactory) {
		super(adapterFactory);
	}

	@Override
	public String getText(Object object) {
		return personItemProvider.getText(object);
	}

}

5. Don’t forget to update your EXTLibraryItemProviderAdapterFactoryExtension and Guice Module to bind WriterItemProviderExtension.


public class EXTLibraryItemProviderAdapterFactoryExtension extends
		EXTLibraryItemProviderAdapterFactory {

	...

	@Inject
	protected WriterItemProvider writerItemProvider;

	@Override
	public Adapter createWriterAdapter() {
		return writerItemProvider;
	}

}


	@Override
	protected void configure() {
		bind(AdapterFactory.class).toInstance(adapterFactory);
		bind(BookItemProvider.class).to(BookItemProviderExtension.class).in(Scopes.SINGLETON);
		bind(PersonItemProvider.class).to(PersonItemProviderExtension.class).in(Scopes.SINGLETON);
		bind(WriterItemProvider.class).to(WriterItemProviderExtension.class).in(Scopes.SINGLETON);
	}

If you run the code now, you will see that the FirstName is displayed as the Name attribute of Writer instead of the LastName.

I will cover the Java-specific issues of the delegation approach in the second part of the tutorial. Hang on!

Sources

Download


Formal Requirement Specification with Xtext & ProR

May 17, 2011 3 comments

Requirements engineering is a critical yet difficult part of software development. Although a lot has been said and written about using formal methods in requirements engineering, they are often not practiced. System requirements are mostly couched in very high-level terms and made comprehensible to end users. “Formal” requirement specifications almost always remain a myth or a specialist thing. A good alternative here would be to use DSLs (Domain Specific Languages) for requirements specification – or better, to mix DSLs with a standard requirements specification format like OMG ReqIF. A DSL offers the flexibility of fine-tuning the specification language so that it is moderate enough for customers to understand and for a specialist to process.

To realize this, and as a first step towards filling the void I was talking about last year, an open source toolset has been developed by itemis in collaboration with Düsseldorf University within the scope of the VERDE project. The toolset consists of an OMG ReqIF-based metamodel, a UI (named ProR) to edit requirements in a DOORS-like table view, and an Xtext extension to integrate DSLs. This tutorial talks about the Xtext extension to the ProR editor.

To see the tool in action, have a look at the screencast by Ömer Gürsoy: http://www.guersoy.net/knowledge/crema

ProR with Xtext Integration


Installation


Creating a formal language

Please follow the Xtext Userguide to learn how to create formal languages using Xtext.

Integrating the formal language to ProR

  • Create a new Eclipse plugin Project
  • Add the following plug-in dependencies to the new project.


The last two dependencies are the plugins Xtext generated for your language.

  • Create a new extension for the extension point de.itemis.pror.presentation.xtext.configuration.
  • language is the id of the newly created language.
  • extension is the file extension of the newly created language.

Please note that this is not the project/plugin id but the language id.

For injector, implement the interface de.itemis.pror.presentation.xtext.core.IXtextInjector or extend de.itemis.pror.presentation.xtext.core.AbstractXtextInjector. The getInjector() method should return the Google injector of your Xtext language UI project. For example,

public class DomainModelInjector extends AbstractXtextInjector {

	@Override
	public Injector getInjector() {
		return DomainmodelActivator.getInstance().getInjector(getLanguage());
	}

}

Please note that it might be necessary to export the *.ui.internal package from the UI project to access the XXXActivator instance as above. We will improve this as soon as Xtext makes the injector public in a different way.

Running ProR with formal language integration

  • Launch the newly created plugins from your workspace, which has ProR already in the target.



Needless to say, build the newly created plugins and add them to the ProR target after testing.

  • Create a new RIF file in ProR
  • Follow ProR documentation and create a datatype for the newly created language. In the example below we use an existing datatype in ProR, T_String32k
  • Create a new Presentation Configuration.
  • For Language, enter the language id of your new language as before.


  • For Presentation Datatype, select an existing datatype. T_String32k in the example below


  • Double click on the requirement attribute which is linked to the new datatype in the ProR grid. Description in the example below.
  • A pop-up dialog opens up with a full-fledged editor for the newly created language. The editor supports most Xtext features like syntax highlighting, auto completion, error highlighting etc.

  • Enter your formal description in the editor
  • Click on a different cell to accept the new value
  • Save the file to persist the contents as usual

(e)Wiki – A model based Wiki framework

October 20, 2010 3 comments

1. What’s Wiki with a (e)?

(e)Wiki is a Wiki markup generation framework based on Mylyn WikiText and EMF. The framework allows you to read and write different Wiki markup formats – this time, the model-driven way. If you have an existing model-based tool chain, the framework could easily fit in there to generate some quick documentation without worrying about markup. You could reuse all the WikiText markup parsers and document builders, combined with the power of EMF features like persistence, notification etc.

The framework was developed as part of a customer project to generate Wiki documentation out of EMF-based domain models. It is currently not open sourced or made available to the public; talks with the customer about this are, however, ongoing.

This article gives an overview of the framework and its features. The intention is also to demonstrate the extensibility of two powerful Eclipse frameworks – Mylyn WikiText and EMF.

2. Architecture

(e)Wiki is an addon to Mylyn WikiText. WikiText is a framework which supports parsing/editing Wiki markup formats like MediaWiki, Textile, Confluence, TracWiki and TWiki, and writing them as HTML, Eclipse Help, DocBook, DITA and XSL-FO. WikiText, however, doesn’t have an internal data model for documents (like DOM for XML). (e)Wiki adds this missing layer to the WikiText framework. Instead of using a POJO data model, (e)Wiki uses a powerful EMF-based metamodel. Using (e)Wiki you could read all the above markup formats and generate an (e)WikiModel based on EMF. The model could then be written out using any of the WikiText document builders.

3. (e)WikiModel

(e)WikiModel is an EMF-based Wiki metamodel. It is a generic metamodel for any Wiki language.

(Only a partial model is shown above)

(e)WikiModel is not meant to be used as a DSL for your domain; it is intended to capture the documentation aspects of your domain model. To better describe semantic information, create a DSL of your own using a framework like Xtext.

4. Features

(e)Wiki currently delivers the following:

  • A metamodel for Wiki called (e)WikiModel
  • (e)WikiEditor to view/edit (e)Wiki files
  • Previewing of rendered (e)Wiki content
  • UI extensions to convert existing markups to (e)WikiModel
  • Generating Textile and Redmine markup (together with HTML and Docbook as already supported by WikiText)
  • Feature to split generated Textile and Redmine markup files into subpages
  • Adding a Table of Contents to the generated output

5. Working with (e)Wiki

5.1. Creating an (e)WikiModel

An (e)Wiki instance can be created either from the Eclipse UI or programmatically.

5.1.1. Creating from UI

Create a markup file within the eclipse workspace.

Right click the markup file and invoke eWiki -> Generate eWiki.

An (e)Wiki file is created in the same folder as the selected file, with the extension .ewiki.

5.1.2. Creating with code

WikiText is based on the “Builder” design pattern. (e)Wiki uses the same pattern and adds a new DocumentBuilder, the EWikiDocumentBuilder. The snippet below shows how to convert existing markup (Textile in this case) to (e)Wiki.


final String markup = "h1. Quisque";

final ResourceSet resourceSet = new ResourceSetImpl();
final Resource eWikiResource = resourceSet.createResource(URI.createFileURI("..."));

MarkupParser parser = new MarkupParser();
parser.setMarkupLanguage(new TextileLanguage());
EWikiDocumentBuilder eWikiDocumentBuilder = new EWikiDocumentBuilder(eWikiResource);
parser.setBuilder(eWikiDocumentBuilder);
parser.parse(markup);

eWikiResource.save(Collections.EMPTY_MAP);

The snippet below shows how to create an (e)WikiModel using the model APIs. If you have worked with EMF-generated model API code, this is no different, except that (e)Wiki adds additional convenient factory methods.

final ResourceSet resourceSet = new ResourceSetImpl();
final Resource eWikiResource = resourceSet.createResource(URI.createFileURI("..."));
Document document = EwikiFactory.eINSTANCE.createDocument();
document.setTitle("Lorem ipsum");

EwikiFactory.eINSTANCE.createHeading(document, 1, "Quisque");

Paragraph paragraph = EwikiFactory.eINSTANCE.createParagraph();
document.getSegments().add(paragraph);
EwikiFactory.eINSTANCE.createText(paragraph, "Lorem ipsum dolor sit amet, consectetur adipiscing elit.");

BullettedList bullettedList = EwikiFactory.eINSTANCE.createBullettedList();
document.getSegments().add(bullettedList);
EwikiFactory.eINSTANCE.createListItem(bullettedList,"Mauris");
EwikiFactory.eINSTANCE.createListItem(bullettedList,"Etiam");

eWikiResource.getContents().add(document);
eWikiResource.save(Collections.EMPTY_MAP);

5.2. (e)WikiEditor

The (e)WikiEditor provides a tree-based editor and a preview tab.

5.2.1. Tree editor tab

The tree editor provides a tree view of the (e)Wiki file. Although the editor could be used to change the file, it is rare that rich text would be edited in this way.

5.2.2. Preview tab

The preview tab provides a preview of the (e)Wiki file as rendered by your default browser.

 

5.3. Generating output

The (e)Wiki file could be converted to HTML, Docbook, Textile and Redmine markup formats.

5.3.1. Generating output from UI

Right clicking the (e)Wiki file brings up the context menu for conversions.

5.3.2. Generating output from Code

You could use any of the existing DocumentBuilders in WikiText to generate output from an (e)WikiModel. The snippet below shows how to convert an (e)Wiki instance to HTML programmatically using the WikiText HTMLDocumentBuilder.

final IFile eWikiFile = ...;
ResourceSet resourceSet = new ResourceSetImpl();
final Resource wikiResource = resourceSet.getResource(URI.createFileURI(eWikiFile.getRawLocation().toOSString()), true);
StringWriter writer = new StringWriter();

EWikiParser parser = new EWikiParser();
parser.setMarkupLanguage(new EWikiLanguage());
parser.setBuilder(new HTMLDocumentBuilder(writer));
parser.parse(wikiResource);

final String htmlContent = writer.toString();

Provisioning your Target Platform as local p2 site

October 9, 2010 12 comments

Provisioning your target platform from a p2/update site is a gem of a feature that was released with Eclipse 3.5. Chris called it “fantasy come true!” and so did many others. You could read the details on how this works from this article by Chris. (On the other hand, if you haven’t used target platforms at all, stop doing any further eclipse plugin development and read about it here first, right from the target platform expert Ekke).

Introduction

Provisioning your target platform from a p2 site allows you to basically download any eclipse flavor you like, and with the click of a button set up your target platform. PDE downloads all the plugins required for your target platform automatically from different software sites based on your target definition and builds your workspace. Although this is a great feature, using this in your workflow has some shortcomings.

  1. If your target platform is large (which is mostly the case), this ends up using a lot of bandwidth every time the bundles are downloaded from the different software sites. Also, if you do not have high-speed internet or you have restricted internet access, the time to initialize the target platform can be very long.
  2. Not all bundles are available from p2/update sites. Although p2 marketing has been quite successful recently, many plugin providers still don’t host their products on p2/update sites. Although you could keep such plugins in local folders or in a shared network folder, this takes away the power of provisioning.
  3. Many p2/update sites don’t believe in archiving older versions and continuing to provide them. Hence you have no guarantee that a platform based on older bundles will work forever.
  4. Many development projects version their target platforms and maintain them in versioning systems like SVN. This is a prerequisite for reconstructing older target platforms. It is, however, not possible with the approach above.

If you have a closer look, you can see that many of the limitations above stem from the fact that the software sites referenced are neither local nor within your control. All of them could be avoided if you provision your target platform from a local p2/update site. This means that, instead of downloading bundles from a public software site, you (and your team) download them from a local p2/update site that you have set up with all your required target platform plugins/features. In this article, I describe a workflow for setting up such a local p2/update site for your target platform.

1. The aggregation task

The first step is to aggregate all the plugins and features required by your target platform. You could easily do this by creating a target definition file which references the software sites of your bundle providers, using “New -> Target Definition“. If a bundle provider doesn’t have a p2/update site, reference its bundles from a local download folder.

You will never share/distribute this “setup” target file

A sample target definition file could look like this.

If you want your target to support multiple platforms, make sure to check this checkbox while you add a software site.

2. Testing the target

Using the target definition editor, set the target by clicking “Set as Target Platform“.

Alternatively, you could set the target platform in “Preferences -> Plug-in Development -> Target Platform” by selecting the newly created target. If all the projects in your workspace build fine, then you have set up the target platform correctly. Otherwise, repeat Step 1 to add the missing plugins until the workspace errors vanish.

3. Creating a new target platform feature

Using the feature project wizard, create a new feature for the target platform.

Make sure to properly name and version the feature.

In the plugin selection page, the plugins listed are the ones in your target platform together with your workspace plugin projects. Select all plugins except the ones in your workspace.

4. Creating a p2 site for target platform

You could do this either using PDE or using Buckminster.

If you want your target platform to support multiple development platforms (OSX, Linux, Windows etc.) use Buckminster (unfortunately, PDE had some export issues doing this)

4.1. Creating p2 update site using PDE

Create a new update site project using the update site project wizard.

Create a proper category and add the target platform feature you created earlier as a child, using “Add Feature…“.

Build the update site using “Build/Build All“.

If the build is successful, the update site project could look like this.

4.2 Creating p2 site using Buckminster

Install Buckminster from update site http://download.eclipse.org/tools/buckminster/updates-3.6 (for Helios).

Create a buckminster.properties file in your target platform feature project.

## buckminster.properties ##

# Where all the output should go
buckminster.output.root=${user.home}/project/site

# Where the temp files should go
buckminster.temp.root=${user.home}/project/tmp

# How .qualifier in versions should be replaced
qualifier.replacement.*=generator:lastRevision

target.os=*
target.ws=*
target.arch=*

Right click the target platform feature project and select “Buckminster -> Invoke Action…“.

Select the buckminster.properties file you created using the “Workspace” button, followed by “site.p2” from the “Matching items” list.

Select “OK” to close the dialog.

The p2 site is created in the feature folder pointed to by buckminster.output.root in your buckminster.properties file.

You do not have to create an update site project (as you did in the PDE approach) when using Buckminster.

5. Deploying the target platform

Identify a server space in your local network where the p2 site could be deployed (preferably a webserver). Copy the p2 site contents to this location.

6. Creating the final target definition

Create a new target file and point it to the newly hosted p2 site (similar to Step 1)

7. Testing the final target platform from local p2 site

Open the target file and set the target using “Set as Target Platform” as before. The bundles from the local p2 site will now be downloaded and your workspace built again. If you have zero errors, you have succeeded in setting up a p2 update site for your target platform. Distribute the final target file to your development team and they could now set their target platform in the same way.

8. Managing different versions of target platforms

You could provision multiple versions of your target platform from the same local p2 site. All you need to do is create a new feature for the new target platform, add it to the update site’s site.xml, build and deploy as before, and distribute the updated target file.

Filling the void

September 6, 2010 11 comments

The search…

As Andreas blogged a while ago, we at itemis are working on a requirements editor as part of a public research project. Developing a full-fledged requirements management tool is not in the scope of the project. What is of interest is to have a solution to enable traceability from requirements to design, all the way to source code. Looking around for an Eclipse-based open source solution for requirements management, we realized that this space is currently empty (except for ORMF, which somehow didn’t fit our needs or give us confidence).

Having decided to move on with a solution of our own, the next step was to shop for a suitable meta-model for requirements. We weren’t quite as unlucky here as before: we had two choices, SysML or RIF.

SysML or RIF (now ReqIF)

SysML is a general-purpose modeling language for systems engineering applications. RIF, on the other hand, is an interchange format for exchanging requirements between partners. Although both SysML and RIF can capture requirements, we found RIF more apt for our needs, as its scope is limited to requirements management. Currently, RIF is up for standardization at the OMG under a new name, ReqIF.

RIF offers a schema-based XML format as well as its UML meta-model in Enterprise Architect. To add to our confidence in RIF, IBM had already incorporated a RIF importer/exporter in IBM Rational DOORS. Several other companies, like MKS, are also working on implementations of the RIF specification.

Bringing RIF to Eclipse world

The first step in building some tooling around RIF was to bring it into an Eclipse-compatible meta-model. Naturally, the first choice was Ecore, the now de-facto Eclipse meta-model. However, converting RIF from an Enterprise Architect format to an Ecore XMI was a herculean task. After failing to do the conversion with the steps I detailed here, we went on a journey of export-import-export from tool to tool and finally had a RIF-compatible Ecore meta-model.

 

The RIF Model

The RIF model is a generic requirements meta-model (from another perspective, too generic to be called a requirements meta-model). The meta-model allows you to define requirement types and requirement objects fitting one of those types. Take, for instance, a type “Functional Requirement” with attributes such as “ID“, “Name” and “Status“. You could use RIF to create a SpecType named “Functional Requirement” with matching AttributeDefinitions for “ID“, “Name” and “Status“. The AttributeDefinitions could also have DatatypeDefinitions. Based on the defined SpecType, you could create SpecObjects having AttributeValues for all the AttributeDefinitions.

This is a bird’s-eye view of the RIF meta-model.
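In EMF-generated code this boils down to ordinary factory calls. The snippet below is only a sketch – RifFactory and the feature accessors (setLongName(), setTheValue(), getSpecAttributes(), getValues() etc.) are illustrative stand-ins, not necessarily the actual generated API:

// All names here are illustrative stand-ins for the generated RIF API.
SpecType specType = RifFactory.eINSTANCE.createSpecType();
specType.setLongName("Functional Requirement");

// An AttributeDefinition "Name" with a string DatatypeDefinition.
AttributeDefinition nameDefinition = RifFactory.eINSTANCE.createAttributeDefinition();
nameDefinition.setLongName("Name");
nameDefinition.setType(RifFactory.eINSTANCE.createDatatypeDefinitionString());
specType.getSpecAttributes().add(nameDefinition);

// A SpecObject conforming to the SpecType, carrying an AttributeValue for "Name".
SpecObject requirement = RifFactory.eINSTANCE.createSpecObject();
requirement.setType(specType);

AttributeValue nameValue = RifFactory.eINSTANCE.createAttributeValue();
nameValue.setDefinition(nameDefinition);
nameValue.setTheValue("The system shall respond within two seconds");
requirement.getValues().add(nameValue);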

The Editor

We retained the generic nature of RIF in the editor as well. To start with, you could create type specifications using a tree-based editor.

Based on the type specifications, the editor at runtime creates an Eclipse Forms-based user interface, lays it out neatly and lets you edit your requirements. As you can see below, the editor creates a tree to display the requirement hierarchy on the left. On the right, the editor lets you edit the requirement attributes by creating matching UI elements to handle the basic datatypes like String, Integer, Enumeration etc. that you created during RIF type specification using the tree editor.


Editor Extensions

You could easily build extensions to the editor using the provided extension points. If you need a specific UI behavior, you could define a new complex datatype and build UI customizations for it. For example, we extended the basic editor to have an embedded DSL editor (using Xtext) to capture detailed requirements.

RIF XML

The RIF editor loads and writes out XML conforming to the RIF schema.

Filling the void

We aren’t quite there yet. We, of course, need more community interest and support to take this forward. Once we have that, we hope to fill the void completely and have an open source solution for requirements management.

Signing Eclipse Plugins using Self-signed Certificates

September 4, 2010 4 comments

Overview

Signing an Eclipse plugin is the process of stamping it with a certificate, by which the plugin can reveal its authenticity to anyone who installs and executes it. Although Eclipse generates unsigned plugins by default, starting with 3.3 Eclipse began verifying the integrity of plugins installed via update sites by checking for an attached digital certificate and issuing a warning when unsigned content is found. Luckily, Eclipse doesn’t prevent you from running the unsigned content. However, if you would like to distribute your Eclipse plugins or host them on an update site, it is important that your plugins are signed. This allows users to reliably identify you as the publisher of the plugin and to make sure that the plugin has not been altered since it was uploaded to the update site. It also avoids the user getting a warning message as below.

Certificates

The signing of a plugin is done using a certificate. A certificate is a digitally signed statement from an entity (a person, company etc.), saying that the public key of some other entity (for example, a Java class file) has a particular value. There are two types of certificates:

  • Self-signed certificates: A self-signed certificate is one you create on your own to sign your plugins. When users install plugins signed with a self-signed certificate, they are presented with a dialog similar to the one below. Users can verify the certificate and install the plugins if they feel the source is trustworthy.

  • Certificates signed by a trusted third party: When a certificate issued by a trusted third party like Verisign is used, the user is not presented with the warning/trust dialog and the plugins are installed directly. However, such certificates have high cost implications. If your plugins are made available on Eclipse.org, they will be signed with the foundation’s certificate (see http://wiki.eclipse.org/JAR_Signing for more information). The process of signing with such certificates is, however, not in the scope of this article.

JAR Signing

Eclipse doesn’t define a mechanism of its own for signing plugins. Since all Eclipse plugins are JARs (well, almost), Eclipse uses the Java mechanism of JAR signing. Also, Eclipse doesn’t come with any tooling for JAR signing (until Bugzilla request 11485 is closed). Hence you have to rely on the command line tools keytool and jarsigner (keytool.exe and jarsigner.exe on Windows) that come with Java to get the job done.

Before you begin, make sure to set the environment variable $JAVA_HOME to the Java location. To identify the location of the installed Java, open the Eclipse "Help > About" dialog and click on "Configuration Details". Look for the value "java.home=<some path>" and copy the entire path. On Windows, replace "$JAVA_HOME" with "%JAVA_HOME%".

The commands below are for Mac OS X/Linux and use “sudo” to make updates. On Windows, leave out “sudo“.

1. Creating a self-signed certificate

This step creates a self-signed certificate with a public and private key and stores it in a keystore. A keystore is the location where all keys and certificates are stored – simply a file where your digital certificates live. We will use Java’s keystore to store the certificates. This is at $JAVA_HOME/lib/security/cacerts, where $JAVA_HOME is the location of your Java installation.

sudo keytool -genkey -dname "cn=<common name>, ou=<organizational unit>, o=<organization>, c=<country>" -alias <alias name> -keystore <keystore location> -storepass <keystore password> -validity <validity of certificate in days>

For example,

sudo keytool -genkey -dname "cn=Nirmal Sasidharan, ou=Pf, o=itemis, c=DE" -alias "nirmal" -keystore $JAVA_HOME/lib/security/cacerts -storepass "changeit" -validity 180

The default Java keystore password is “changeit” unless you have changed it. The command would ask for a password to be created for the alias. Enter a password, confirm it and remember it for the next step and for later.

2. Signing the JARs

To sign the plugin and feature JARs with the certificate created in the step above, run the following command.

jarsigner -keystore <keystore location> -storepass <keystore password> -verbose <jar file> <alias name>

For example,

jarsigner -keystore $JAVA_HOME/lib/security/cacerts -storepass "changeit" -verbose de.itemis.project.updatesite/plugins/de.itemis.plugin_1.0.0.jar nirmal

When asked for a password, enter the password for the alias created in the step above. This signs the JAR using the certificate identified by the alias.

The command signs one JAR at a time. To do batch signing, you could create a simple shell script (or an equivalent batch file on Windows) as below:

#!/bin/bash
## jarbatchsign.sh

export JAVA_HOME=/System/Library/Frameworks/JavaVM.framework/Home

for i in $1/*.jar
do
  jarsigner -keystore $JAVA_HOME/lib/security/cacerts -storepass "changeit" -verbose -keypass $3 $i $2
done

Invoke the script as

./jarbatchsign.sh <path to folder containing jars> <alias name> <password for alias>

For example,

./jarbatchsign.sh de.itemis.project.updatesite/plugins/ nirmal aliaspassword

3. Testing the signed plugins

Delete your own certificate from the keystore before you test the update site with the signed plugins (see “Deleting certificate from keystore” below).

Restart eclipse and install the signed plugins from the update site. If all is well, a trust dialog as described before appears.

Other Useful functions

Listing certificates in keystore

sudo keytool -list -keystore <keystore location> -storepass <keystore password> -v -alias <alias name>

For example,

sudo keytool -list -keystore $JAVA_HOME/lib/security/cacerts -storepass "changeit" -v -alias nirmal

Deleting certificate from keystore

sudo keytool -delete -keystore <keystore location> -storepass <keystore password> -alias <alias name>

For example,

sudo keytool -delete -keystore $JAVA_HOME/lib/security/cacerts -storepass "changeit" -alias nirmal

Verifying signed jars

jarsigner -keystore <keystore location> -storepass <keystore password> -verify -verbose -certs <jar file>

For example,

jarsigner -keystore $JAVA_HOME/lib/security/cacerts -storepass "changeit" -verify -verbose -certs de.itemis.project.updatesite/plugins/de.itemis.plugin_1.0.0.jar

Disabling security check

You could disable the Eclipse certificate check altogether using the startup option -Declipse.p2.unsignedPolicy=allow. See Bug 235526.

Split Packages – An OSGi nightmare

September 2, 2010 2 comments

A “split package” is a pretty old Java term for packages in different libraries with the same name providing related (or sometimes unrelated) functionality. The classes in one package can access classes in the other package (with the same name) across library boundaries without any problem at both compile time and runtime. This is true even for package-private classes. This works fine in the plain Java world, as you deal with only a single hierarchical class loader. If you bring this concept to the OSGi (or Eclipse) world, where each bundle has its own class loader, you are in for trouble. This is what happened to me while I was doing some heavy refactoring recently.

I had the task of making a “monolithic” Eclipse system modular. Refactoring in JDT did a great job, and in the end I had a modular system with no compile-time errors. The pain started when I ran the application and started getting exceptions I had never seen before while working with the Eclipse framework.

Here’s what I did in short. I split a plug-in into two plugins, “Plugin A” and “Plugin B”, where “Plugin A” contained the main functionality and “Plugin B” was providing some library functions. So naturally “Plugin A” depended on “Plugin B” (or in OSGi terms, “Plugin A” had a “Require-Bundle: Plugin B” declared). Though not intentional, both plugins ended up having a package with the same name “SplitPackage”. A “Class A” in “SplitPackage” in “Plugin A” had a method call on “Class B” in “SplitPackage” in “Plugin B”.

PDE did not complain, nor did I get any compile-time errors. But when I ran it, I got a “java.lang.IllegalAccessError: tried to access Class B from Class A”. I kept wondering why I would get this linkage error at runtime, when I should normally get it at compile time from PDE. After spending half a day on it, I found the cause: “Class B” was declared package-private. The moment I changed it into a public class, all was well again (although this wasn’t my final solution).

The root cause here was that OSGi loaded “Plugin A” and “Plugin B” using different class loaders, so the two “SplitPackage” packages are not the same: they live in two different worlds with a clear boundary between them. “Class A” in “Plugin A” cannot see the package-private “Class B” after the bundles are resolved, which resulted in the java.lang.IllegalAccessError at runtime. The error doesn’t pop up at compile time, as the Java compiler doesn’t know these boundaries and simply sees a split package. May seem strange, but true!
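A minimal sketch of the scenario (class names as in the story above, with the package name lower-cased to Java conventions; the two files live in the two different plugins):

// In "Plugin B": src/splitpackage/ClassB.java
package splitpackage;

class ClassB { // package-private
	String name() {
		return "Class B";
	}
}

// In "Plugin A": src/splitpackage/ClassA.java
package splitpackage;

public class ClassA {
	public String describe() {
		// Compiles fine: the compiler sees one logical package "splitpackage".
		// At runtime under OSGi, ClassB belongs to Plugin B's class loader, so
		// this package-private access fails with java.lang.IllegalAccessError.
		return new ClassB().name();
	}
}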

The moral of the story: never have split packages in the Eclipse environment. For OSGi, a package is the atomic unit, and it should be as cohesive as possible. If you have functionality split into two packages with the same name, they are not cohesive; ideally the packages should have different names, or the classes should live within one package. If you really need split packages, make one of them an Eclipse fragment plugin. A fragment always lives within a host plugin, and both are loaded by the same class loader.
