Archive

Archive for the ‘Eclipse’ Category

Eclipse Project Set Editor

January 15, 2013 2 comments

An Eclipse Project Set file (.psf) enables quick export and import of files from a repository like Git, SVN etc. Eclipse currently supports exporting files in a repository as a Project Set file, but an editor for an existing PSF file is missing. This means that updating an existing PSF file (for example, adding a new project or removing an existing one) requires editing the PSF XML by hand!

The Project Set Editor provides a simple user interface to view and edit PSF files (similar to Manifest Editor or Target Definition Editor). The editor also allows directly importing the artifacts in a PSF file.

You can find the project at Eclipse Labs: http://code.google.com/a/eclipselabs.org/p/psfeditor/

Doors are open for testers and committers.

 


2 Fast thrice Furious

August 3, 2011 1 comment

Preparing to make the initial code contribution for RMF, we ran our RIF/ReqIF metamodels through several performance tests. To start with, we tested the load and save times of RIF files based on some industry samples. To get some comparison data, we generated XMI files using the same data held in RIF XML files and tested the load/save time against it. The results are quite promising.

Before we go into the details of the tests, it's better to define the two components involved in our test comparisons.

  • The customized RIF XML loader (a.k.a. RMF Loader) and serializer (a.k.a. RMF Serializer) for loading/saving OMG RIF XML files into the RIF Ecore metamodel (read more on the metamodel implementation here).
  • The default RIF XMI loader and serializer for loading/saving RIF XMI files into the RIF Ecore model (this is not in the scope of RMF; we use it only for comparison).

Here are some highlights from our tests.

  • A 32MB RIF XML file is loaded in 14.4 seconds by the RMF loader, whereas the same data in XMI format is loaded by the default EMF XMI loader in 22.2 seconds (and 70 mins(!!) without any optimizations to the XMI loader)
  • The average time taken to load one MB of data from RIF XML is 0.5 seconds, whereas RIF XMI takes 1.63 seconds per MB. For saving, the average time per MB of data to RIF XML is 0.09 seconds, whereas RIF XMI takes 1.22 seconds per MB
  • The load and save times for RIF XML files with the RMF loader/serializer increase linearly with file size


Requirements Modeling Framework (RMF)

July 18, 2011 2 comments

Over the last few months some guys at itemis and Düsseldorf University have been working closely on an open solution to lessen the big gap that currently exists in the Eclipse ecosystem in the area of Requirements Management (RM). The Requirements Modeling Framework (RMF) has been proposed to the Eclipse Foundation as an open source project under the Model Development Tools project.

Scope of RMF

The scope of the project, as described in the proposal, is to provide an implementation of the OMG ReqIF standard (just as the Eclipse UML2 project provides an EMF based metamodel for OMG UML). Requirements management tools could then base their implementation on the provided ReqIF metamodel. It is also in the scope of the project to deliver a requirements authoring tool, again based on the ReqIF metamodel, and optionally a generic traceability solution.

Significance of ReqIF

ReqIF (Requirements Interchange Format) is an open standard from the OMG for requirements exchange (I had blogged a while ago on how we arrived at ReqIF, earlier called RIF). Many tools like DOORS already support a snapshot export to this format. Having an EMF based metamodel for ReqIF opens up the Eclipse framework for integration and new tool development in the area of Requirements Engineering.

ReqIF in a nutshell


ReqIF offers a generic metamodel to capture requirements (the generic nature is at times highly criticized; more on that later). The figure above shows a bird’s-eye view of the metamodel. The metamodel allows the creation of requirement types (SpecTypes) with different attribute types, and also instances of them (SpecObjects). Since ReqIF also carries the metadata (in the form of SpecTypes), any tool that understands ReqIF can process the data.

ReqIF also allows creating hierarchies of requirements, grouping them and controlling user access. To support rich text, the metamodel reuses XHTML.

Generic nature and “meta”ness of ReqIF

Anyone expecting a requirements metamodel might be surprised at first when they have a look at ReqIF. You hardly find the term “requirement” in there. ReqIF sits at a higher meta-level, that is, M1. It has been designed this way for a reason.

Requirements Management is an evolving field and until recently never crossed company boundaries. With tighter collaboration between partner companies, the benefits of applying RM across company borders became apparent. Because the field is dominated by commercial products, there was barely any chance for standardization. A group of companies in the automotive industry realized that, with such diversity, hardly any unification was possible. They gave birth to this generic requirements interchange format, which was later adopted and standardized by the OMG.

The generic structure of ReqIF allows companies to use the tools of their choice for requirements management, use ReqIF as the standard exchange mechanism, and even do round trips. ReqIF is not the only standard with such a generic nature in the field of Requirements Engineering. DOORS, for example, has an extensible database allowing users to add/delete attributes. The changes to the schema are often communicated to the partners by external means so they can replicate the changes in their database. Since ReqIF carries the meta-information with it, such changes are migrated automatically across tool and company boundaries. Transmitting the meta-information in ReqIF could, however, be restricted by tooling; this would nevertheless defeat the purpose of ReqIF.

It is also questionable why such a generic model is required when metamodels like UML or EMF are already available. The reason is much the same as why EMF co-exists with UML: ReqIF is not a general purpose modeling language like UML or EMF, but is focused on the requirements domain.

Integrating ReqIF

RMF provides ReqIF as an EMF based metamodel. This can be used by tool vendors as the internal tool model or as an export model by means of model-to-model transformation. The provided loaders and serializers make sure that ReqIF files are read and written according to the ReqIF XML Schema.
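As a rough sketch of what this looks like from a tool vendor's perspective, the snippet below loads and saves a ReqIF file through the plain EMF resource layer. It assumes that RMF registers a ReqIF-aware resource factory for the file extension; the extension and file name are only illustrative.

import java.io.IOException;
import java.util.Collections;

import org.eclipse.emf.common.util.URI;
import org.eclipse.emf.ecore.resource.Resource;
import org.eclipse.emf.ecore.resource.ResourceSet;
import org.eclipse.emf.ecore.resource.impl.ResourceSetImpl;

public class ReqIFRoundTrip {

    public static void main(String[] args) throws IOException {
        // Assumption: the RMF loader/serializer is registered for the "reqif"
        // file extension, so the resource set picks it up automatically.
        ResourceSet resourceSet = new ResourceSetImpl();

        // Loading parses the ReqIF XML into the EMF based metamodel.
        Resource resource = resourceSet.getResource(
                URI.createFileURI("requirements.reqif"), true);
        System.out.println("Loaded " + resource.getContents().size() + " root object(s)");

        // Saving writes the model back out as schema conformant ReqIF XML.
        resource.save(Collections.emptyMap());
    }
}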

Using ReqIF natively brings all the advantages of ReqIF to the tooling as well. For example, ReqIF based tools could model the requirements domain of the company within the tooling and share it with partners along with the instance data.

What next?

The initial code contribution is planned for early August 2011 and will be made available as soon as the IP review at Eclipse is complete. If you have any suggestions or comments about RMF, would like to contribute, or would like to be listed as an interested party, please provide it here.

Images from FreeDigitalPhotos.net

10 common EMF mistakes

May 25, 2011 16 comments

1. Treating EMF generated code as only an initial code base

EMF is a good starter kit for introducing MDSD (Model-Driven Software Development) in your project. Anyone who has generated their first code using EMF will surely be impressed by the short turnaround time needed to get an initial version of a model based application up and running. For many, the thrill ends as soon as they need to change the behaviour of the generated code and have to dig into it and understand it. It is complex, but only as complex as any other framework of similar size. Many projects I have come across make a mistake at this point: they start treating the EMF generated code as merely an initial code base and make it dirty by modifying it by hand, adding @generated NOT tags in the beginning and later not even that. At this point you part ways with the “EMF way of doing things” and also with MDSD.

The customizations mostly start with the generated editor. This is quite acceptable, as the editor is intended as an initial “template” and expects you to customize it. However, this is not the case with the model and edit projects. They have to stay in sync with your model, and this requires a strong decision, specifically: “I will not touch the generated code”. Follow the EMF recommended way of changing generated code, or use Dependency Injection.

2. Model modification without EMF Commands

In EMF you deal a lot with model objects. Almost any UI selection operation hands a model object over to you. It is easy to fall into the trap of changing the state of model objects directly once you have access to them, either using the generated model API or using reflective methods. Problems due to such model updates are detected only later, mostly when Undo/Redo operations stop working as expected.

Use EMF Commands! Modify your model objects only through commands: use the EMF commands directly, extend them, or create your own. When you need to combine many operations into a single undoable unit, use CompoundCommand. Use ChangeCommand when you want to make a lot of model changes in a transactional way.
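A minimal sketch of what this looks like, assuming an editing domain and a model object from a generated library-style model (the feature parameters are illustrative): instead of calling setters, the changes are wrapped in commands and executed on the command stack, so undo/redo keeps working.

import org.eclipse.emf.common.command.Command;
import org.eclipse.emf.common.command.CompoundCommand;
import org.eclipse.emf.ecore.EObject;
import org.eclipse.emf.ecore.EStructuralFeature;
import org.eclipse.emf.edit.command.SetCommand;
import org.eclipse.emf.edit.domain.EditingDomain;

public final class CommandExample {

    // Wrap both attribute changes in commands and combine them into one
    // undoable unit instead of calling setters on the model object directly.
    public static void updateBook(EditingDomain domain, EObject book,
            EStructuralFeature title, EStructuralFeature pages) {
        Command setTitle = SetCommand.create(domain, book, title, "New title");
        Command setPages = SetCommand.create(domain, book, pages, 320);

        CompoundCommand update = new CompoundCommand("Update book");
        update.append(setTitle);
        update.append(setPages);

        domain.getCommandStack().execute(update);
    }
}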

3. Not reacting to notifications/Overuse of notifications

EMF notification is one of the most powerful features of the framework. Any change to a model object is notified to anyone interested in it. Yet many projects decide not to listen to model changes, and instead react by traversing the model again to look for changes, or by making assumptions about model changes and baking that behaviour into the UI code.

When you want to react to a model change, don’t assume that you are the only one who will originate that change. Always listen to model changes and react accordingly. Don’t forget the handy EContentAdapter class.

On the contrary, some projects add a lot of listeners for model changes without applying proper filters. This can greatly slow down your application.
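A minimal sketch of the first point: an EContentAdapter attached to the model root receives notifications for the whole containment tree, and filtering early keeps it cheap (the actual reaction is left out).

import org.eclipse.emf.common.notify.Notification;
import org.eclipse.emf.ecore.util.EContentAdapter;

public class ModelChangeListener extends EContentAdapter {

    @Override
    public void notifyChanged(Notification notification) {
        // Keeps the adapter attached to newly added children.
        super.notifyChanged(notification);

        // Filter early: ignore touch events and everything but SET changes.
        if (notification.isTouch() || notification.getEventType() != Notification.SET) {
            return;
        }
        // React to the change here, e.g. refresh only the affected UI element.
    }
}

Attach it once to the model root with rootObject.eAdapters().add(new ModelChangeListener()).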

4. Forgetting EcoreUtil

Before writing your own utility functions, keep an eye on the EcoreUtil class. Often, the generic function you are trying to implement is already there.
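A few examples of what is already there (the object passed in is arbitrary):

import org.eclipse.emf.ecore.EObject;
import org.eclipse.emf.ecore.util.EcoreUtil;

public final class EcoreUtilExamples {

    public static void demo(EObject original) {
        EObject copy = EcoreUtil.copy(original);              // deep copy of a containment tree
        boolean equal = EcoreUtil.equals(original, copy);     // structural equality check
        EObject root = EcoreUtil.getRootContainer(original);  // top of the containment hierarchy
        EcoreUtil.delete(copy);                               // remove an object and clean up references to it
        System.out.println("copy equals original: " + equal + ", root: " + root);
    }
}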

5. Leaving Namespace URI as default

When you create an Ecore model, EMF generates a default Prefix and Namespace URI for you. Mostly this is left as it is, without realizing that it is the most important ID for your new model. Only when newer versions of the model need to be released later do people start looking for meaningful URIs.

Change the default to a descriptive one before generating code, for example by including some version information of the model.
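Usually you would change the values directly in the Ecore editor's Properties view before generating code; the snippet below merely sketches what a descriptive, versioned namespace might look like (the values are illustrative).

import org.eclipse.emf.ecore.EPackage;

public final class NamespaceSetup {

    public static void describe(EPackage libraryPackage) {
        libraryPackage.setNsPrefix("library");
        // A versioned, project-specific URI instead of the generated default.
        libraryPackage.setNsURI("http://www.example.org/library/1.0");
    }
}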

6. UI getting out of sync with model

The EMF generated editor makes use of JFace viewers. These viewers are always kept in sync using the generated ItemProviders. However, these viewers cannot always satisfy every UI requirement; you might have to use other SWT components or Eclipse Forms. At this juncture, many implementors mix UI and model concerns together and in turn lose the sync between the model and the UI.

Although you could go ahead and implement your own custom viewers, the easier way is to use EMF Databinding.
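A minimal sketch of such a binding, assuming a Text widget and a feature from your model (both illustrative): the widget and the model object stay in sync without hand-written listeners. In an editor with an editing domain you would use EMFEditProperties instead, so that changes go through commands (see mistake 2).

import org.eclipse.core.databinding.DataBindingContext;
import org.eclipse.emf.databinding.EMFProperties;
import org.eclipse.emf.ecore.EObject;
import org.eclipse.emf.ecore.EStructuralFeature;
import org.eclipse.jface.databinding.swt.WidgetProperties;
import org.eclipse.swt.SWT;
import org.eclipse.swt.widgets.Text;

public final class BindingExample {

    public static void bind(Text titleText, EObject book, EStructuralFeature titleFeature) {
        DataBindingContext context = new DataBindingContext();
        // Every change in the widget is written to the model and vice versa.
        context.bindValue(
                WidgetProperties.text(SWT.Modify).observe(titleText),
                EMFProperties.value(titleFeature).observe(book));
    }
}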

7. Relying on default identification mechanism

Every EMF object is identified by an XPath-like URI fragment, for example //@orders.1/@items.3. This is the default referencing mechanism within an EMF model or across models. If you look closely you can see that the default mechanism is based on feature names and indexes. The default implementation can become dangerous if the index order changes and the references are not updated accordingly.

Extend this mechanism to uniquely identify your model objects, or simply assign an intrinsic ID to them. EMF will then use these IDs to reference your objects.
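A minimal sketch of the second option: mark one attribute of your EClass as its ID (usually done once in the Ecore model itself) and let EcoreUtil assign the values.

import org.eclipse.emf.ecore.EAttribute;
import org.eclipse.emf.ecore.EObject;
import org.eclipse.emf.ecore.util.EcoreUtil;

public final class IdExample {

    // Marks the attribute as the intrinsic ID; references then use its value
    // instead of positional URI fragments.
    public static void makeIdAttribute(EAttribute attribute) {
        attribute.setID(true);
    }

    // Assigns a generated UUID to objects that don't have an ID yet.
    public static void assignId(EObject object) {
        if (EcoreUtil.getID(object) == null) {
            EcoreUtil.setID(object, EcoreUtil.generateUUID());
        }
    }
}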

8. Forgetting reflective behavior

Many consider the reflective features of EMF an advanced topic. They are not as complex as they seem.

Instead of depending fully on the generated APIs, think about creating generic, reusable functions that make use of the meta information every EObject carries. Have a look at the implementation of the EcoreUtil.copy function to understand what I mean.
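A minimal sketch of such a generic helper: it prints every attribute of any EObject using only the meta information the object carries, with no generated API involved.

import org.eclipse.emf.ecore.EAttribute;
import org.eclipse.emf.ecore.EObject;

public final class ReflectiveDump {

    public static void printAttributes(EObject object) {
        // eClass() gives access to the metamodel; eGet() reads the values reflectively.
        for (EAttribute attribute : object.eClass().getEAllAttributes()) {
            System.out.println(attribute.getName() + " = " + object.eGet(attribute));
        }
    }
}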

9. Standalone? Why do I bother

A common misunderstanding is that EMF is Eclipse specific. EMF can run perfectly well outside Eclipse in “stand-alone” applications.

While developing applications based on EMF, design them so that the core is not tied to Eclipse. This allows your applications to run even in a non-OSGi, server-side environment.
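A minimal sketch of a stand-alone setup: outside Eclipse there is no extension registry, so packages and resource factories are registered by hand (the commented package literal is illustrative).

import org.eclipse.emf.common.util.URI;
import org.eclipse.emf.ecore.resource.Resource;
import org.eclipse.emf.ecore.resource.ResourceSet;
import org.eclipse.emf.ecore.resource.impl.ResourceSetImpl;
import org.eclipse.emf.ecore.xmi.impl.XMIResourceFactoryImpl;

public final class StandaloneSetup {

    public static ResourceSet createResourceSet() {
        ResourceSet resourceSet = new ResourceSetImpl();
        // Register how *.xmi files are read and written.
        resourceSet.getResourceFactoryRegistry().getExtensionToFactoryMap()
                .put("xmi", new XMIResourceFactoryImpl());
        // Register the generated model package, e.g.:
        // resourceSet.getPackageRegistry().put(LibraryPackage.eNS_URI, LibraryPackage.eINSTANCE);
        return resourceSet;
    }

    public static void main(String[] args) {
        Resource resource = createResourceSet().createResource(URI.createFileURI("library.xmi"));
        System.out.println("Created " + resource.getURI());
    }
}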

10. Not reading the EMF bible


This is the biggest mistake a newcomer can make. The book is a must-read for anyone who starts working with EMF.

Order now (Hope this fetches me a beer at the next EclipseCON Europe ;))


Extending EMF ItemProviders using Google Guice – I

May 18, 2011 7 comments

EMF ItemProviders are the most important middlemen in the EMF framework, providing all of the interfaces needed for model objects to be viewed or edited. They adapt EMF objects by providing the following:

1. Content and label providers which enable model objects to be viewed in Eclipse viewers
2. Property source for model objects to make them available in Eclipse Property View
3. Commands for editing model objects
4. Forwarding change notifications on to Eclipse viewers

EMF generates the ItemProviders for you, assuming some defaults. Although the defaults work most of the time, there are occasions where you would like to change them. You could make the changes directly in the generated code and add @generated NOT annotations. However, by doing this you invite trouble from the MDSD gurus, because the principle says: “Don’t touch generated code!”. Ideally you would like to keep the generated ItemProviders as they are and extend them somehow with your changes.

This tutorial shows how to do this in an elegant way using Dependency Injection (DI). We will use the EMF “Extended Library Model” example to take us through this.

Setup

1. Create a new Eclipse workspace

2. Add the “EMF Extended Library Model Example” projects using “New Project Wizard”

3. Run the example projects and create a sample Library model

Extending ItemProviders the EMF way

The generated editor by default displays the title of a Book in the tree editor. This is due to the default getText() implementation in BookItemProvider.

  /**
   * This returns the label text for the adapted class.
   * <!-- begin-user-doc -->
   * <!-- end-user-doc -->
   * @generated
   */
  @Override
  public String getText(Object object)
  {
    String label = ((Book)object).getTitle();
    return label == null || label.length() == 0 ?
      getString("_UI_Book_type") : //$NON-NLS-1$
      getString("_UI_Book_type") + " " + label; //$NON-NLS-1$ //$NON-NLS-2$
  }

Let’s say we want to change this so that the number of pages in the book is displayed along with its title. You could do this by changing the generated code and adding the @generated NOT annotation.

  /**
   * This returns the label text for the adapted class.
   * <!-- begin-user-doc -->
   * <!-- end-user-doc -->
   * @generated NOT
   */
  @Override
  public String getText(Object object)
  {
    String title = ((Book)object).getTitle();
    String label = title == null || title.length() == 0 ?
      getString("_UI_Book_type") : //$NON-NLS-1$
      getString("_UI_Book_type") + " " + title; //$NON-NLS-1$ //$NON-NLS-2$
    return label + " (" + ((Book)object).getPages() + ")"; //$NON-NLS-1$ //$NON-NLS-2$
  }

If you now run the editor, the number of pages is displayed together with the book name.

This however breaks the Generation Gap Pattern in code generation.

Extending ItemProviders by extension

The cleaner approach is to extend the generated ItemProvider with your changes. We then also have to make sure that the tooling uses our customized ItemProvider. This is rather easy.

EMF ItemProviders are in fact EMF adapters. They provide behavioural extensions to the modeled objects. The recommended way of creating adapters in EMF is through AdapterFactories. The ItemProviders are also created this way, using the generated ItemProviderAdapterFactory (EXTLibraryItemProviderAdapterFactory in this example). You override the createXXXAdapter() methods to create instances of your custom ItemProviders and register your extended ItemProviderAdapterFactory.

Let’s first do this the traditional way (without DI).

1. Following Heiko’s Generation Gap Pattern article, you could change extlibrary.genmodel to output the generated code into a src-gen folder of the edit plugin and put your extensions into the src folder. In this tutorial, to further isolate our changes, we create a new extension plugin, org.eclipse.example.library.edit.extension.

2. Create a new class BookItemProviderExtension, which extends BookItemProvider, within the new plugin.

public class BookItemProviderExtension extends BookItemProvider {

	public BookItemProviderExtension(AdapterFactory adapterFactory) {
		super(adapterFactory);
	}

	@Override
	public String getText(Object object) {
		return super.getText(object) + " (" + ((Book) object).getPages() + ") ";
	}

}

3. Create EXTLibraryItemProviderAdapterFactoryExtension which extends EXTLibraryItemProviderAdapterFactory

public class EXTLibraryItemProviderAdapterFactoryExtension extends
		EXTLibraryItemProviderAdapterFactory {

	@Override
	public Adapter createBookAdapter() {
		if (bookItemProvider == null) {
			bookItemProvider = new BookItemProviderExtension(this);
		}
		return bookItemProvider;
	}

}

4. Modify the editor code to use the new ItemProviderAdapterFactoryExtension


public class EXTLibraryEditor extends MultiPageEditorPart
  implements
    IEditingDomainProvider,
    ISelectionProvider,
    IMenuListener,
    IViewerProvider,
    IGotoMarker
{
...
 protected void initializeEditingDomain()
	  {
	    // Create an adapter factory that yields item providers.
	    //
	    adapterFactory = new ComposedAdapterFactory(ComposedAdapterFactory.Descriptor.Registry.INSTANCE);
	    adapterFactory.addAdapterFactory(new ResourceItemProviderAdapterFactory());
	    adapterFactory.addAdapterFactory(new EXTLibraryItemProviderAdapterFactoryExtension());
	    adapterFactory.addAdapterFactory(new ReflectiveItemProviderAdapterFactory());
    	...
	}
	...
}

If you run the editor again, now including the new extension plugin, you get the same result as before. With this step we have managed to isolate our changes in a new plugin.

Extending ItemProviders using Google Guice

Although we managed to isolate our changes without touching the generated code, this might not be enough in the long run. What if

1. you need multiple ItemProvider implementations for the same EMF Object and want to switch between them
2. you want to extend many ItemProviders in the inheritance hierarchy. For example, you need to change PersonItemProvider and WriterItemProvider (which extends PersonItemProvider).

Although you don’t need DI to solve these problems, DI does it for you in a simpler, cleaner way. In this tutorial we will use Google Guice to achieve this. Google Guice is a cool, lightweight dependency injection framework: you can inject your dependencies by writing just a few lines of code and some annotations. If you don’t like annotations, you can even use Guice without them. If you are not familiar with Google Guice, read Getting Started.

Let’s go ahead and “Guicify” our earlier example. We start with simple modifications and move on to more detailed ones in later steps.

1. First, add a dependency on Google Guice to our org.eclipse.example.library.edit.extension plugin.

Google Guice is currently not publicly available as an Eclipse plugin. There is a Bugzilla request to add it to Orbit. It is, however, shipped with Xtext as an Eclipse plugin. Since I have Xtext in my target platform, I use that in this tutorial. If you don’t, you need to add Google Guice as an external jar to your project.

2. The next step is to get rid of the “new” statements in the extended ItemProviderAdapterFactory. This is what binds the ItemProviderAdapterFactory to a specific ItemProvider implementation. We use Google Guice field injection to inject the BookItemProvider.


public class EXTLibraryItemProviderAdapterFactoryExtension extends
		EXTLibraryItemProviderAdapterFactory {

	@Inject
	protected BookItemProvider bookItemProvider;

	@Override
	public Adapter createBookAdapter() {
		return bookItemProvider;
	}

}

3. We now need to create a Google Guice Module to bind the extended ItemProvider. So go ahead and create a module as follows:


public class LibraryModule extends AbstractModule {
	@Override
	protected void configure() {
		bind(BookItemProvider.class).to(BookItemProviderExtension.class).in(Scopes.SINGLETON);
	}

}
	@Override
	protected void configure() {
		bind(BookItemProvider.class).to(BookItemProviderExtension.class).in(Scopes.SINGLETON);
	}

}

4. You could also inject the extended ItemProviderAdapterFactory into our editor. However, since we don’t want the editor to have a Google Guice dependency, we instead make the following changes to the module and the extended ItemProvider.


public class LibraryModule extends AbstractModule {

	private final AdapterFactory adapterFactory;

	public LibraryModule(AdapterFactory adapterFactory) {
		this.adapterFactory = adapterFactory;
	}

	@Override
	protected void configure() {
		bind(AdapterFactory.class).toInstance(adapterFactory);
		bind(BookItemProvider.class).to(BookItemProviderExtension.class).in(Scopes.SINGLETON);
	}

}


public class BookItemProviderExtension extends BookItemProvider {

	@Inject
	public BookItemProviderExtension(AdapterFactory adapterFactory) {
		super(adapterFactory);
	}

	@Override
	public String getText(Object object) {
		return super.getText(object) + " (" + ((Book) object).getPages() + ") ";
	}

}

5. Now we need to create a Guice Injector.


public class EXTLibraryItemProviderAdapterFactoryExtension extends
		EXTLibraryItemProviderAdapterFactory {

	public EXTLibraryItemProviderAdapterFactoryExtension() {
		// injectMembers() is needed so that the @Inject fields of this
		// factory instance are actually populated by Guice.
		Guice.createInjector(new LibraryModule(this)).injectMembers(this);
	}

	@Inject
	protected BookItemProvider bookItemProvider;

	@Override
	public Adapter createBookAdapter() {
		return bookItemProvider;
	}

}

6. Run it and you get the same result as before.

You could now inject a different implementation of the ItemProvider by only creating a new binding in the module file.

That was a rather trivial example. Let’s take a more significant one, where we need to make changes to PersonItemProvider and WriterItemProvider (which extends PersonItemProvider).

The Extended Library Model example by default displays the Lastname of a Writer for the Name attribute. This comes from the following lines of code in PersonItemProvider, the superclass of WriterItemProvider.


  @Override
  public String getText(Object object)
  {
    String label = ((Person)object).getLastName();
    return label == null || label.length() == 0 ?
      getString("_UI_Person_type") : //$NON-NLS-1$
      getString("_UI_Person_type") + " " + label; //$NON-NLS-1$ //$NON-NLS-2$
  }

Let’s change this to display the Firstname instead of the Lastname.

1. Create a new extension ItemProvider for Person, PersonItemProviderExtension, and override the getText() method as follows


public class PersonItemProviderExtension extends PersonItemProvider {

	@Inject
	public PersonItemProviderExtension(AdapterFactory adapterFactory) {
		super(adapterFactory);
	}

	@Override
	public String getText(Object object) {
		String label = ((Person) object).getFirstName();
		return label == null || label.length() == 0 ? getString("_UI_Person_type") : //$NON-NLS-1$
				getString("_UI_Person_type") + " " + label; //$NON-NLS-1$ //$NON-NLS-2$

	}

}

2. Inject the extended PersonItemProviderExtension into the ItemProviderAdapterFactory extension.


public class EXTLibraryItemProviderAdapterFactoryExtension extends
		EXTLibraryItemProviderAdapterFactory {
	...

	@Inject
	protected PersonItemProvider personItemProvider;

	@Override
	public Adapter createPersonAdapter() {
		return personItemProvider;
	}

}

3. Update the Google Guice Module


	@Override
	protected void configure() {
		bind(AdapterFactory.class).toInstance(adapterFactory);
		bind(BookItemProvider.class).to(BookItemProviderExtension.class).in(Scopes.SINGLETON);
		bind(PersonItemProvider.class).to(PersonItemProviderExtension.class).in(Scopes.SINGLETON);
	}

If you run the code now, you will see that we don’t get the expected result yet. This is because WriterItemProvider still extends PersonItemProvider and not PersonItemProviderExtension, where we integrated the changes. We could go ahead and create a new WriterItemProviderExtension which extends PersonItemProviderExtension, but that would tie WriterItemProviderExtension to the PersonItemProviderExtension implementation. We would like to inject each of these extensions without creating any inter-dependency between them.

4. We can change inheritance to delegation and use injection again here; that is, inject PersonItemProviderExtension into WriterItemProviderExtension and delegate the getText() call to it.

Changing inheritance to delegation, however, comes at the cost of some Java specific issues which I will talk about in a later part of this article.


public class WriterItemProviderExtension extends WriterItemProvider {

	@Inject
	private PersonItemProvider personItemProvider;

	@Inject
	public WriterItemProviderExtension(AdapterFactory adapterFactory) {
		super(adapterFactory);
	}

	@Override
	public String getText(Object object) {
		return personItemProvider.getText(object);
	}

}

5. Don’t forget to update your EXTLibraryItemProviderAdapterFactoryExtension and Guice Module to bind WriterItemProviderExtension.


public class EXTLibraryItemProviderAdapterFactoryExtension extends
		EXTLibraryItemProviderAdapterFactory {

	...

	@Inject
	protected WriterItemProvider writerItemProvider;

	@Override
	public Adapter createWriterAdapter() {
		return writerItemProvider;
	}

}


	@Override
	protected void configure() {
		bind(AdapterFactory.class).toInstance(adapterFactory);
		bind(BookItemProvider.class).to(BookItemProviderExtension.class).in(Scopes.SINGLETON);
		bind(PersonItemProvider.class).to(PersonItemProviderExtension.class).in(Scopes.SINGLETON);
		bind(WriterItemProvider.class).to(WriterItemProviderExtension.class).in(Scopes.SINGLETON);
	}

If you run the code now, you will see that the Firstname is displayed as the Name attribute of Writer, instead of the Lastname.

I will cover those Java specific issues in the second part of the tutorial. Hang on!

Sources

Download

Formal Requirement Specification with Xtext & ProR

May 17, 2011 3 comments

Requirements engineering is a critical yet difficult part of software development. Although a lot has been said and written about using formal methods in requirements engineering, it is often not practiced. System requirements are mostly couched in very high level terms to keep them comprehensible to end users. “Formal” requirement specifications almost always remain a myth or a specialist thing. A good alternative here is to use DSLs (Domain Specific Languages) for requirements specification, or better, to mix DSLs with a standard requirements specification format like OMG ReqIF. A DSL offers the flexibility of fine tuning the specification language so that it is simple enough for customers to understand and still precise enough for a specialist to process.

To realize this, and as a first step towards filling the void I was talking about last year, an open source toolset has been developed by itemis in collaboration with Düsseldorf University within the scope of the VERDE project. The toolset consists of an OMG ReqIF based metamodel, a UI (named ProR) to edit requirements in a DOORS-like table view, and an Xtext extension to integrate DSLs. This tutorial covers the Xtext extension to the ProR editor.

To see the tool in action, have a look at the screencast by Ömer Gürsoy: http://www.guersoy.net/knowledge/crema

ProR with Xtext Integration


Installation


Creating a formal language

Please follow the Xtext User Guide to learn how to create formal languages using Xtext.

Integrating the formal language to ProR

  • Create a new Eclipse plugin Project
  • Add the following plug-in dependencies to the new project.


The last two dependencies are the plugins newly generated by Xtext for your language.

  • Create a new extension for the extension point de.itemis.pror.presentation.xtext.configuration.
  • language is the id of the newly created language.
  • extension is the file extension of the newly created language

Please note that this is not the project/plugin id but the language id 

For injector, implement the interface de.itemis.pror.presentation.xtext.core.IXtextInjector or extend de.itemis.pror.presentation.xtext.core.AbstractXtextInjector. The getInjector() method should return the Google injector of your Xtext language UI project. For example,

public class DomainModelInjector extends AbstractXtextInjector {

	@Override
	public Injector getInjector() {
		return DomainmodelActivator.getInstance().
                               getInjector(getLanguage());
	}

}

Please note that it might be necessary to export the *.ui.internal package from the UI project to access the XXXActivator instance as above. We will improve this as soon as Xtext makes the injector public in a different way.

Running ProR with formal language integration

  • Launch the newly created plugins from your workspace which has ProR already in the target



Needless to say, build the newly created plugins and add them to the ProR target after testing

  • Create a new RIF file in ProR
  • Follow ProR documentation and create a datatype for the newly created language. In the example below we use an existing datatype in ProR, T_String32k
  • Create a new Presentation Configuration.
  • For Language, enter the language id of your new language, as before


  • For Presentation Datatype, select an existing datatype. T_String32k in the example below


  • Double click on the requirement attribute which is linked to the new datatype in the ProR grid. Description in the example below.
  • A pop-up dialog opens up with a full-fledged editor for the newly created language. The editor supports most Xtext features like syntax highlighting, auto completion, error highlighting etc.

  • Enter your formal description in the editor
  • Click on a different cell to accept the new value
  • Save the file to persist the contents as usual

(e)Wiki – A model based Wiki framework

October 20, 2010 3 comments

1. What’s Wiki with a (e)?

(e)Wiki is a Wiki markup generation framework based on Mylyn WikiText and EMF. The framework allows reading and writing different Wiki markup formats – this time, the model driven way. If you have an existing model based tool chain, the framework can easily fit in there to generate some quick documentation, without worrying about markups. You can reuse all the WikiText markup parsers and document builders, combined with the power of EMF features like EMF persistence, EMF notification etc.

The framework was developed as part of a customer project to generate Wiki documentation from EMF based domain models. The framework is currently not open sourced or made available to the public; talks with the customer about this are, however, ongoing.

This article gives an overview of the framework and its features. The intention is also to demonstrate the extensibility of two powerful Eclipse frameworks – Mylyn WikiText and EMF.

2. Architecture

(e)Wiki is an addon to Mylyn WikiText. WikiText is a framework which supports parsing/editing Wiki markup formats like MediaWiki, Textile, Confluence, TracWiki and TWiki and writing them out as HTML, Eclipse Help, DocBook, DITA and XSL-FO. WikiText however doesn’t have an internal data model for documents (like the DOM for XML). (e)Wiki adds this missing layer to the WikiText framework. Instead of using a POJO data model, (e)Wiki uses a powerful EMF based metamodel. Using (e)Wiki you can read all the above markup formats and generate an (e)WikiModel based on EMF. The model can then be written out using any of the WikiText document builders.

3. (e)WikiModel

(e)WikiModel is an EMF based Wiki metamodel. It is a generic metamodel for any Wiki Language.

(Only partial model is shown above)

(e)WikiModel is not aimed at being used as a DSL for your domain. It is intended to capture the documentation aspects of your domain model. To better describe semantic information, define a DSL of your own using a framework like Xtext.

4. Features

(e)Wiki currently delivers the following:

  • A metamodel for Wiki called (e)WikiModel
  • (e)WikiEditor to view/edit (e)Wiki files
  • Previewing of rendered (e)Wiki content
  • UI extensions to convert existing markups to (e)WikiModel
  • Generating Textile and Redmine markup (together with HTML and Docbook as already supported by WikiText)
  • Feature to split generated Textile and Redmine markup files into subpages
  • Add Table of Contents to generated output

5. Working with (e)Wiki

5.1. Creating a (e)WikiModel

An (e)Wiki instance can be created either from the Eclipse UI or programmatically.

5.1.1. Creating from UI

Create a markup file within the eclipse workspace.

Right click the markup file and invoke eWiki -> Generate eWiki.

An (e)Wiki file is created in the same folder as the selected file, with the extension .ewiki.

5.1.2. Creating with code

WikiText is based on the “Builder” design pattern. (e)Wiki uses the same pattern and adds a new DocumentBuilder, the EWikiDocumentBuilder. The snippet below shows how to convert existing markup (Textile in this case) to (e)Wiki.


final String markup = "h1. Quisque";

final ResourceSet resourceSet = new ResourceSetImpl();
final Resource eWikiResource = resourceSet.createResource(URI.createFileURI("..."));

MarkupParser parser = new MarkupParser();
parser.setMarkupLanguage(new TextileLanguage());
EWikiDocumentBuilder eWikiDocumentBuilder = new EWikiDocumentBuilder(eWikiResource);
parser.setBuilder(eWikiDocumentBuilder);
parser.parse(markup);

eWikiResource.save(Collections.EMPTY_MAP);

The snippet below shows how to create an (e)WikiModel using the model APIs. If you have worked with EMF generated model API code, this is no different, except that (e)Wiki adds additional convenience factory methods.

final ResourceSet resourceSet = new ResourceSetImpl();
final Resource eWikiResource = resourceSet.createResource(URI.createFileURI("..."));
Document document = EwikiFactory.eINSTANCE.createDocument();
document.setTitle("Lorem ipsum");

EwikiFactory.eINSTANCE.createHeading(document, 1, "Quisque");

Paragraph paragraph = EwikiFactory.eINSTANCE.createParagraph();
document.getSegments().add(paragraph);
EwikiFactory.eINSTANCE.createText(paragraph, "Lorem ipsum dolor sit amet, consectetur adipiscing elit.");

BullettedList bullettedList = EwikiFactory.eINSTANCE.createBullettedList();
document.getSegments().add(bullettedList);
EwikiFactory.eINSTANCE.createListItem(bullettedList,"Mauris");
EwikiFactory.eINSTANCE.createListItem(bullettedList,"Etiam");

eWikiResource.getContents().add(document);
eWikiResource.save(Collections.EMPTY_MAP);

5.2. (e)WikiEditor

The (e)WikiEditor provides a tree based editor and a preview tab.

5.2.1. Tree editor tab

The tree editor provides a tree view of the (e)Wiki file. Although the editor could be used to change the file, it is rare that rich text would be edited in this way.

5.2.2. Preview tab

The preview tab provides a preview of the (e)Wiki file as rendered by your default browser.

 

5.3. Generating output

The (e)Wiki file could be converted to HTML, Docbook, Textile and Redmine markup formats.

5.3.1. Generating output from UI

Right clicking the (e)Wiki file brings up the context menu for conversions.

5.3.2. Generating output from Code

You can use any of the existing DocumentBuilders in WikiText to generate output from an (e)WikiModel. The snippet below shows how to convert an (e)Wiki instance to HTML programmatically using the WikiText HtmlDocumentBuilder.

final IFile eWikiFile = ...;
ResourceSet resourceSet = new ResourceSetImpl();
final Resource wikiResource = resourceSet.getResource(URI.createFileURI(eWikiFile.getRawLocation().toOSString()), true);
StringWriter writer = new StringWriter();

EWikiParser parser = new EWikiParser();
parser.setMarkupLanguage(new EWikiLanguage());
parser.setBuilder(new HtmlDocumentBuilder(writer));
parser.parse(wikiResource);

final String htmlContent = writer.toString();
Follow

Get every new post delivered to your Inbox.