Friday, June 28, 2013

Cheating on Bucky & Is that a component behind that tree?

Cheating on Bucky

How do you go about learning something new? The intuitive approach is to read everything you can grab about the topic, and I tend to do this myself. In the case of Buckminster the newbie is certainly not left in the dark: there is a free downloadable book, and the wiki is undoubtedly one of the better wikis covering an Eclipse technology.

Personally I tend to get impatient after some reading and want to try things out for my own needs. This is a pitfall which can lead to long trial-and-error sessions. One should really be able to recognize trial-and-error, step back and say... oh no... I need to learn this thing properly before spending more time fiddling around. In the case of learning Bucky, this is exactly what happened, so I decided to create a Cheat Sheet which would present the big Bucky picture and at the same time hold sufficient detail. As we all know, the best learning happens while writing and drawing; the brain sucks it in like a sponge!

The first version can be found here. It has not been reviewed by a Buckminster authority, so please use it at your own risk!

Is that a component behind that tree?

Wow, what a relief. I feel I have mastered Buckminster and am ready to build all the components in the world. The actual setup of the Buckminster artifacts like the cquery, rmap etc. for NetXStudio came with a different challenge: setting up the .rmap to locate components and getting the correct locator URL came with surprises, which I would like to share here.

This is my 3.x shopping list:

- Eclipse Platform + RCP (I need the .ide plugin for various reasons).
- Eclipse EMF 2.something.
- Eclipse EMF/CDO 4.2 (And actually 4.0 first).
- Eclipse GEF/Draw2d/Zest
- Eclipse Nebula widgets.
- Eclipse Xtext 2.something.
- Eclipse Xpand, Xtend and MWE; tricky stuff, as these components are getting old.
- Google Collect/Inject
- Apache various libs, the usual suspects... (I am sure you know them).
- Apache POI for reading/writing certain spreadsheet formats.
- Javax stuff, like persistence API.
- SWT Chart, a nice and well documented charting widget.
- ... and the ones I can't think of right now.

Here my venture started: I needed to walk through the forest of projects publishing their components and pick the ones I needed. I also needed to pick a component reader, which is luckily very often p2, so that was a quick win. One question pops up all the time: why don't I download this thingy, push it up to my Git repo, and then resolve it from my workspace? I could do that for the whole component list above, but obviously this has some drawbacks:

  • I would keep a shadow of already managed and published components. 
  • I won't be continuously integrating. (Do I actually really want that?). 
So, resisting the "self-push-into-github-components-I-need" temptation, I took out my torch, put on a helmet and went URL hunting. (I recommend bringing a large cup of cappuccino, and don't forget to let friends and family know you will be away for a while.)

Resolving Platform

For the platform, I found this:

http://wiki.eclipse.org/Eclipse_Project_Update_Sites

It has p2 sites for older releases, and the latest and greatest in the flavours Milestone, Integration and Nightly.
Note: not to be confused with the Simultaneous Release repositories, which bundle many Eclipse components into one single well-known p2 repository (i.e. Juno, Indigo etc.).

Resolving EMF

Next up is EMF. Now, EMF is rock solid. With great amusement, I follow the occasional attempt on the EMF forum to talk "it" into incorrect behavior; this usually fails, and rarely is a fix needed.
But how about that p2 URL? Well, so far the main p2 repository advertised from the download page is this:

http://download.eclipse.org/modeling/emf/updates/releases/

Now that's weird: pasting this p2 link into the IDE, I only see older EMF releases. In the case of EMF I'd like to get the latest stable, let's say 2.8... but where is the p2 URL? I can download it as a whole, but no p2 can be located... Perhaps I can tell Bucky to link to the p2 in .zip format.

I dug around and wondered: with CI, there must be tons of projects requiring EMF, so how do they do it? The .rmap from CDO gave it away; here is the URL they use:

http://download.eclipse.org/modeling/emf/emf/updates/

This can be used as a p2 site in the IDE without problems. Notice that the emf fragment is stated twice in the URL. Where is this documented...? No answers for now. The URL can be specified in more granular form by appending a release, i.e. "2.9-I-build", which would give you that specific integration build. At this very moment, I haven't figured out what the pattern is; appending 2.8.1, for example, is not a valid p2 URL.


Resolving EMF Compare

Next up is EMF Compare. This is used by Xtext to diff models, so depending on the Xtext version there will be a dependency on a specific EMF Compare version. In my case this was 1.2; unfortunately the packaging changed in 2.0, so simply pointing to the 2.0 update site didn't work. I had to point to an older release, luckily available in p2 format.

This is the URL I ended up using:

Resolving M2T (Xpand/Xtend)

These buggers are available from the Model To Text (M2T) project. In my case, I need some older releases, which are archived and not available in p2 form. (The zipped p2 URL isn't accepted as a valid repo when pointing to it directly.) The archive could be extracted on the server, but in this case I felt more comfortable importing the projects into my workspace and committing them to Git. (Yes, in this case, I didn't resist self-managing.)

Resolving Xtext

This was pretty easy, at least that's what I thought... The Xtext p2 URL is this:


This will get you the latest Xtext, which is fine. However, the generated Xtext editor has optional dependencies related to the Xtext builders and code generators. It ends up needing (optionally) the JDT, LTK, emf.codegen and more, which I don't want in a runtime. Although Buckminster will recognize these as optional, if resolved it will also try to resolve the dependencies of the children, which might not succeed. This is where an advisor node in the .cquery comes in handy: just skip the child components from optional plugin dependencies.

There is also a Composite Release URL, which seems to contain everything from Itemis.
This is the one:

http://download.eclipse.org/modeling/tmf/xtext/updates/composite/releases/

I haven't tried it, but I think that with the p2 reader it's possible to point to specific categories of the p2 repo.

Done! Well, that's what I thought...

So finally the sweet smell of success: all my components are resolved, and no more little red dots!

Give me more! 


What is left is a ton of wishes: how do I add my unit tests? Can I produce the Javadoc? I actually want to publish the help as wiki, HTML and PDF... all those cool things. I am sure I'll find solutions for all these challenges in the time to come.


Monday, April 29, 2013

Eclipse Modeling Framework - UI E4

I want to learn Eclipse 4, or e4 in short, and I want to learn about Xtend as well. So I figured: why not take the existing generated Eclipse 3.x EMF editor and migrate it to e4? It likely requires adapting the templates which generate the editor, so here I can try and use Xtend.

Get the result (so far)!
  • Grab E4MF here  (It's not done,  please log a bug on github for enhancements). 
If you wonder what the Eclipse EMF editor is, you should read about it here; I will only say that it allows you to generate a fully functional editor for whatever EMF model you define.

The Conclusion

Migrating a fully functional multi-page editor which interacts with other views to e4 is not a trivial task. First of all it's about clean-up. Dependency injection is very powerful, and the new e4 platform provides a very useful implementation of these concepts.

The Approach:

The e4 platform is really 'different', to say the least. It's been out for a couple of years now, and there is sufficient information around to learn from and try to actually achieve my goal, but still there are some fundamentally different concepts.

I list a few of them here:
  1. An e4 UI application is first constructed as an application model. So there is an Application.e4xmi with its own editor to define the application structure. 
  2. Dependency injection is everywhere. Parts are POJOs, so do not (or only minimally) extend classes or implement interfaces. If services are needed, they will be injected. There are plenty of examples. 
  3. The application UI model and the actual rendering are decoupled. This means that the rendering can be exchanged, e.g. from SWT to JavaFX or Swing or whatever. 
  4. All org.eclipse.ui.* plugins are deprecated as 3.x platform plugins. This is an important fact to consider in the migration. 

My approach is to take the EMF extended library example and generate the editor plugins; there are typically three of them:

org.eclipse.emf.examples.library.edit
org.eclipse.emf.examples.library.editor
org.eclipse.emf.examples.library.test

...and then make a copy of these and turn them into a working e4 editor. This will form the target to generate, i.e. for the templates which generate from the EMF .genmodel files.

.genmodel options

Now this is a bit tricky. The .genmodel file has various options which will add/remove functionality from the .edit and .editor plugins. In my approach, I will use the defaults and worry about the optional features later. I know this 'could' get me in trouble, but I want to get started, and I don't feel like figuring out which set of options makes the most functional EMF editor.

The only option I set here is to generate an RCP version of the editor.
This has some consequences as we will see later on.

Finding dependencies

We know that for a pure e4 product, the org.eclipse.ui plugins should not be used anymore. So we need an overview or map of where these dependencies exist. What I am going to do is remove these dependencies, see what breaks, and replace them with the e4 alternative.

The generated editor has the following dependencies.

org.eclipse.emf.examples.library.editor
=> org.eclipse.core.runtime
=> org.eclipse.emf.examples.library.edit
=> org.eclipse.emf.ecore.xmi
=> org.eclipse.emf.edit.ui

The first two will need some rework, the latter two are not UI plugins so these are OK.

org.eclipse.emf.examples.library.edit
=> org.eclipse.core.runtime
=> org.eclipse.emf.edit
=> org.eclipse.emf.examples.library

After inspecting the dependencies of org.eclipse.emf.edit, I am surprised to see there are no UI dependencies. The .edit plugin therefore requires no adaptation, although it provides the base implementations for the command and adaptation patterns used in EMF.

org.eclipse.emf.edit.ui
=> org.eclipse.core.runtime
=> org.eclipse.ui.views
=> org.eclipse.ui.workbench
=> org.eclipse.emf.edit
=> org.eclipse.emf.common.ui
=> org.eclipse.core.resources (Optional)
=> org.eclipse.ui.ide (Optional)
=> org.eclipse.jface.text (Optional) 

So what needs to be adapted? As we stated earlier, the org.eclipse.ui.* plugins have been replaced in e4. In our case the following plugin dependencies are affected:
=> org.eclipse.ui.views
=> org.eclipse.ui.workbench
=> org.eclipse.ui.ide (Optional)

The EMF common.ui plugin also has dependencies on the Eclipse 3.x UI framework.

org.eclipse.emf.common.ui 
=> org.eclipse.core.runtime
=> org.eclipse.ui
=> org.eclipse.emf.common
=> org.eclipse.core.resources (Optional)
=> org.eclipse.ui.ide (Optional)
=> org.eclipse.jface.text (Optional)

Here only org.eclipse.ui is a dependency we haven't seen before, so this will be migrated as well.
 
I conclude that the base UI plugins from EMF will need to be reworked. I decide to clone EMF and rename these plugins (as they will need to co-exist should the EMF team wish to pull from me). The new names are:

org.eclipse.e4mf.edit.ui
org.eclipse.e4mf.common.ui 

Now my instinct tells me to break the 3.x dependencies here and fix them with the e4 alternatives, so I decide to do so. But before I go into this, I would like a basic EMF e4 editor working,
so I start here:


org.eclipse.emf.examples.library.editor

The approach is to drill top-down through the application structure. So we start with the Application, then the Perspective, then the editors and views, then the Actions etc.

Step 1. Fix the dependencies, which have changed with the e4 versions of

org.eclipse.e4mf.edit.ui
org.eclipse.e4mf.common.ui

Step 2. Create an e4 Application Model.

File -> New -> Eclipse 4 -> Model -> New Application Model
This creates a file named Application.e4xmi

[TRICK] You can instruct e4 to load the application model from a specific location, with the following property in the Application extension definition:

 <property
               name="applicationXMI"
               value="org.eclipse.emf.examples.library.e4editor/xmi/Application.e4xmi">
</property>

Here we tell e4 to load the model from a subdirectory 'xmi'; this is also where we keep the e4 fragments.

Step 3.  Populating Application Model

Here we migrate the generated EMF editor functionality into the application model.
In e4 we need a perspective stack and the actual perspective.

PerspectiveStack => ID: org.eclipse.emf.examples.library.e4editor.perspectivestack.0 (Generated)
Perspective => ID: org.eclipse.emf.examples.library.e4editor.perspective.0

Now in the EMF editor we have the actual model editor on the left, and the outline and properties on the right. So we need PartSashContainers, PartStacks and Parts for this. Here are the IDs:

PartSashContainer => ID: org.eclipse.emf.examples.library.e4editor.partsashcontainer
      PartStack => ID: org.eclipse.emf.examples.library.e4editor.partstack.editor
      PartSashContainer => ID: org.eclipse.emf.examples.library.e4editor.partsashcontainer.1
           PartStack => ID: org.eclipse.emf.examples.library.e4editor.partstack.0
                   Part => ID: org.eclipse.emf.examples.library.e4editor.part.1
           PartStack => ID: org.eclipse.emf.examples.library.e4editor.partstack.1
                   Part => ID:  org.eclipse.emf.examples.library.e4editor.part.2


As we will see later on, we would actually like a PartDescriptor for opening the editor; unfortunately, as of writing, this is not supported.

Maximize and Minimize

The base functionality doesn't provide the maximize and minimize buttons for a PartStack.
Adding the plugin org.eclipse.e4.ui.workbench.addons.swt to the launch config will make some additional e4 addons available. One of them is the MinMax addon.

[TRICK]
With e4 it's possible to show the runtime application model editor. This is especially useful when
working with fragments and processors (application model contributions).

To do this, include the following plugins in the launch config or product or other build system.
  • org.eclipse.e4.tools.emf.liveeditor 
  • org.eclipse.e4.tools.emf.ui 
  • org.eclipse.e4.tools.emf.ui.script.js
  • (Additional required)
When available, the key combination ALT-SHIFT-F9 can be used to bring up the live editor.
[BUG] On Mac OS X it's not working with the current build (Kepler M6):
https://bugs.eclipse.org/bugs/show_bug.cgi?id=394503

Step 4. The EMF Editor

In e4, an editor is a POJO, so it doesn't extend EditorPart, nor MultiPageEditorPart.
The EMF editor has several functionalities. In order to rebuild them one by one, we first:
  • Stop extending MultiPageEditorPart. 
  • Remove the overridden framework methods (we are a POJO!). 
  • Implement the various e4 concepts instead.
Constructing the UI

In e4 we can designate any method with e4 lifecycle annotations like @PostConstruct, which will call the method at the corresponding point in the lifecycle of a part or other UI element. On top of that we can add method arguments, which will be injected (if available) by the context.

For our EMF Editor it starts with the init method.

Simply doing this gives us a parent composite to add our views to:

@PostConstruct
public void init(Composite parent, ......[more arguments to follow]){
   createPages();
}

We call createPages(), which is normally called by the MultiPageEditorPart. As there is no equivalent for MPEP, we settle for adapting createPages() to only create one page.
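
To make this concrete, here is a minimal sketch of what the injected init of the editor POJO could look like. The class name EXTLibraryE4Editor is my own, and the exact set of injected arguments is an assumption; e4 will supply whatever is available in the context:

import javax.annotation.PostConstruct;
import javax.inject.Inject;

import org.eclipse.e4.ui.di.Focus;
import org.eclipse.e4.ui.model.application.ui.MDirtyable;
import org.eclipse.e4.ui.model.application.ui.basic.MPart;
import org.eclipse.swt.widgets.Composite;

public class EXTLibraryE4Editor { // hypothetical class name

	private Composite container;

	@Inject
	private MDirtyable dirtyable; // see the Dirtiness section below

	@PostConstruct
	public void init(Composite parent, MPart part) {
		// 'parent' is the composite the SWT renderer created for this MPart.
		this.container = parent;
		createPages();
	}

	private void createPages() {
		// Adapted from the generated MultiPageEditorPart code: only one page,
		// created directly on 'container'.
	}

	@Focus
	public void setFocus() {
		// Pass focus on to the main viewer control.
	}
}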

ViewerPane

Viewer panes have a title bar and functionality to maximize/restore the viewer pane.
The concept is however tied very tightly to IWorkbenchPart and IWorkbenchPage. The widget is actually placed inside a so-called ViewerForm which allows control of layout, margins etc.
One other feature we will lose from ViewerPane is that the title is updated with the object selected
in the view pane.

For now we decide not to migrate this concept and to create the viewers directly under the parent composite.
 

Init the Editor input

The init(IEditorSite site, IEditorInput input) method of a 3.x editor should find its equivalent
in e4. The 3.x definition in the plugin.xml starts with <extension point="org.eclipse.ui.editors">. It allows us to specify the implementation, an icon, a contributor class and more.

The EMF Editor in 3.x generates a content type parser and a model specific content type
which are used to respectively parse the content and associate a file type by extension with the EMF generated editor.

Unfortunately e4 currently doesn't have a part descriptor which resembles the 3.x equivalent. There is an MInputPart and it can be put in the Application Model, but it will be static. What we want is to query the framework for editor descriptors which match a criterion like a file name or protocol. This concept is recognized and named EditorPartDescriptor. An extension to the e4 Application Model to support this is available in the simple IDE demo (Tom Schindl).

Some learnings from the demo:
  • It shows how an input from one MPart (the Navigator) is set on the context with context.set(IFile.class, f); and later injected into a Handler's @Execute method arguments. The context in this case is the IEclipseContext. 
  • The contributed EditorPartDescriptors for the .xml, .text and .java editors are checked for their supported extensions. If a match is found, the e4 command service is used to fire a command, which invokes the OpenEditor handler, which then creates an MInputPart and sets the contributionURI according to the EditorPartDescriptor to make sure the correct editor is contributed and instantiated.
  • It demonstrates how the input URI of the MInputPart is adapted to an IDocument which the editor can consume. The adaptation is done with a so-called ContextFunction. The adapter implements org.eclipse.e4.core.contexts.IContextFunction, which is offered as an OSGi service.

[DECISION] We do not implement this concept, as it requires an extension to the e4 Application Model. See the "OpenHandler" section further down for the actual chosen implementation.

Modularity and Contributions in e4

We could add a model fragment, but this will also be static. It is intended to provide extensibility but does not substitute for the old *.editors extension point.

The best possible solution for now is (a sketch follows the list):

1. Implement an Open Handler
2. Create an MInputPart programmatically
3. Set the Contribution URI to our own EMF Editor
4. Set the Input URI for the MInputPart
5. Activate the part with the EPartService
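
A minimal sketch of how these five steps could look in a handler. The part stack ID is the one defined earlier in this post, the editor contribution class is the hypothetical EXTLibraryE4Editor from above, and MBasicFactory.INSTANCE.createInputPart() assumes MInputPart is still part of the model in your target release; treat this as an illustration, not the final OpenHandler:

import java.io.File;

import javax.inject.Named;

import org.eclipse.e4.core.di.annotations.Execute;
import org.eclipse.e4.ui.model.application.MApplication;
import org.eclipse.e4.ui.model.application.ui.basic.MBasicFactory;
import org.eclipse.e4.ui.model.application.ui.basic.MInputPart;
import org.eclipse.e4.ui.model.application.ui.basic.MPartStack;
import org.eclipse.e4.ui.services.IServiceConstants;
import org.eclipse.e4.ui.workbench.modeling.EModelService;
import org.eclipse.e4.ui.workbench.modeling.EPartService;
import org.eclipse.e4.ui.workbench.modeling.EPartService.PartState;
import org.eclipse.swt.SWT;
import org.eclipse.swt.widgets.FileDialog;
import org.eclipse.swt.widgets.Shell;

public class OpenHandlerSketch {

	@Execute
	public void execute(MApplication application, EModelService modelService,
			EPartService partService,
			@Named(IServiceConstants.ACTIVE_SHELL) Shell shell) {

		// 1. Ask the user for a model file (the real handler delegates this to HandlerSupport).
		FileDialog dialog = new FileDialog(shell, SWT.OPEN);
		dialog.setFilterExtensions(new String[] { "*.extlibrary" });
		String path = dialog.open();
		if (path == null) {
			return;
		}

		// 2. Create an MInputPart programmatically.
		MInputPart part = MBasicFactory.INSTANCE.createInputPart();
		part.setLabel(new File(path).getName());
		part.setCloseable(true);

		// 3. Point the contribution URI to our own EMF editor POJO.
		part.setContributionURI("bundleclass://org.eclipse.emf.examples.library.e4editor/"
				+ "org.eclipse.emf.examples.library.e4editor.EXTLibraryE4Editor");

		// 4. The input URI is the resource we want to edit.
		part.setInputURI(new File(path).toURI().toString());

		// Add it to the editor part stack defined in Application.e4xmi.
		MPartStack stack = (MPartStack) modelService.find(
				"org.eclipse.emf.examples.library.e4editor.partstack.editor", application);
		stack.getChildren().add(part);

		// 5. Activate the part with the EPartService.
		partService.showPart(part, PartState.ACTIVATE);
	}
}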

Dealing with Dialogs -getShell()

In 3.x we call getSite().getShell() to get the active shell for the part, and the EMF editor does this as well. The alternative in e4 is this:

    @Inject
    @Named(IServiceConstants.ACTIVE_SHELL)
    private Shell activeShell;

So we replace all invocations of getSite().getShell() with activeShell.

Dirtiness

In 3.x an editor part implements ISaveablePart, which is the interface for marking the editor dirty (dirty meaning it has been edited and should be saved; the user is notified with an * next to the title of the part). In order for the workbench to know about a dirty editor, we had to call firePropertyChange(IEditorPart.PROP_DIRTY).

In e4 we have this alternative:

@Inject
private MDirtyable dirtyable;

Whenever a part is dirty we have to call

dirtyable.setDirty(false/true);

In the case of the EMF editor, this happens in the CommandStack listener and when the editor is saved.

Saving is done by putting the @Persist annotation on the save method (a sketch follows below):

doSave(IProgressMonitor monitor)
doSaveAs() [TODO: how would this work? It needs to be bound to an action; perhaps it needs @Execute instead of @Persist]
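
A compact sketch of how the dirty flag and @Persist could be wired together inside the editor POJO; the command-stack hookup mirrors what the generated 3.x editor does, and the class and method names are illustrative:

import java.util.EventObject;

import javax.inject.Inject;

import org.eclipse.core.runtime.IProgressMonitor;
import org.eclipse.core.runtime.NullProgressMonitor;
import org.eclipse.e4.ui.di.Persist;
import org.eclipse.e4.ui.model.application.ui.MDirtyable;
import org.eclipse.emf.common.command.BasicCommandStack;
import org.eclipse.emf.common.command.CommandStackListener;

public class DirtyAndSaveSketch {

	@Inject
	private MDirtyable dirtyable;

	private final BasicCommandStack commandStack = new BasicCommandStack();

	/** Wire the EMF command stack to the e4 dirty flag. */
	protected void hookCommandStack() {
		commandStack.addCommandStackListener(new CommandStackListener() {
			public void commandStackChanged(EventObject event) {
				dirtyable.setDirty(commandStack.isSaveNeeded());
			}
		});
	}

	/** Called by the framework when the part is asked to save itself. */
	@Persist
	public void doSave() {
		doSave(new NullProgressMonitor());
	}

	public void doSave(IProgressMonitor monitor) {
		// ... save the resources, as in the generated 3.x doSave() ...
		commandStack.saveIsDone();
		dirtyable.setDirty(false);
	}
}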


Adapters

The adapter concept is also supported by e4. In classical 3.x a part is consulted (adapted) for certain interfaces and returns an implementation if supported. For the EMF Editor the following is adapted:

  • IContentOutlinePage.class
  • IPropertySheetPage.class
Now we have a bit of an issue here, as neither of these classes is available as an e4 implementation.
Considering the problem these classes try to solve, like dealing with selection etc., the situation is very different in e4.

The editor also listens for part changes, and activates the EMF editor whenever the property page or outline becomes active and is related to the EMF editor.

Properties

The Properties concept is not implemented in e4, as we aim for pure e4 (Not compat layer).
See https://bugs.eclipse.org/bugs/show_bug.cgi?id=404884

[DECISION] Defined MPart placeholder, for keybinding to work.

Outline

The outline concept is implemented in pure e4 in the simpleIDE demo (Tom Schindl).

[TODO] consider adopting this concept.

[DECISION] Defined MPart placeholder, for keybinding to work. 

Step 5. Actionsets and Actions.

The 3.x EMF editor generates two action sets: one for the editor and one for the model.

ActionsSet.1 
  • About
  • Open URI
  • Open
ActionSet.2
  • Model (New)

ActionSets in 3.x are used to group actions which belong to a certain task usually represented in a certain perspective. Binding of ActionSets and Perspectives is done with the extension org.eclipse.ui.actionSetPartAssociations

Now, besides the fact that ActionSets are deprecated even in the 3.x platform (use Commands and Handlers instead), the generated EMF editor, being an RCP app, actually only has one perspective and doesn't bind the action sets to the defined perspective.

Other Actions

The EMF Editor also creates various menus programmatically and adds global actions to the 3.x ActionBarAdvisor. This is:

File Menu

File -> [FILE_START]
            New -> [MB_ADDITIONS]*
            ----- ID: org.eclipse.emf.examples.library.e4editor.menuseparator.file.additions
            [MB_ADDITIONS]
            -----
            Close
            Close All
            -----
            Save
            Save As
            Save All
            ------
            Quit
            [FILE_END]

* The contributions between brackets [...] are markers for dynamic insertion in 3.x. For e4, the insertion points are the IDs of other items. Menu separators can be pre-inserted in the application; fragments can then contribute relative to these.

Example: In our case we define a menu separator:  
org.eclipse.emf.examples.library.e4editor.menuseparator.file.additions

Model fragments use these ID's to contribute 'before' or 'after' as we will see later on. 

Creating the structure in e4 Commands, Handlers

The equivalent in e4 is to simply add Commands, Handlers and Key Bindings to the Application Model for most actions, and to insert the Application Model contributions in the right place using the IDs of UI model elements.

Command ID's

[BUG?] The documentation states that commonly used commands should use the IDs known from IWorkbenchCommandConstants. However, using these in some cases causes a menu or toolbar not to be shown. For example:

For the About command we should use "org.eclipse.ui.help.aboutAction"; this causes the 'About' menu entry not to show.

Handlers

Note that the Handler implementations are interesting, as they use DI to get relevant objects like the workbench or the active part.

Most of the handlers are straightforward; a minimal one is sketched below, and we then discuss some of the specific ones.
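
A minimal sketch of such a straightforward handler, here for the About command; the class name and dialog text are illustrative:

import javax.inject.Named;

import org.eclipse.e4.core.di.annotations.Execute;
import org.eclipse.e4.ui.services.IServiceConstants;
import org.eclipse.jface.dialogs.MessageDialog;
import org.eclipse.swt.widgets.Shell;

public class AboutHandler {

	@Execute
	public void execute(@Named(IServiceConstants.ACTIVE_SHELL) Shell shell) {
		// The active shell is injected instead of being fetched from a site or PlatformUI.
		MessageDialog.openInformation(shell, "About", "EXTLibrary e4 editor example");
	}
}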

OpenHandler

The Open handler re-uses the functionality already in the 3.x EMF Editor. Like the methods to open a dialog and select a file based on extensions.

This functionality is however exposed to dependency injection via a HandlerSupport service made available through OSGi. The service provides facilities to open an editor, open a file dialog respecting the EMF model file extensions, and more. See HandlerSupport.

[TODO] Considering the dynamic nature of the e4 Application Model, the part stack holding the EMF editor can be closed. So fix the Open/New handlers to cope with a non-existent part stack.

Key Bindings

An initial e4 Application model doesn't have any binding context associated.
A default binding context hierarchy will be defined as:

Binding Context - Window and Dialog (applies to both)
    |
    +- Binding Context - Window
    +- Binding Context - Dialog

Binding Tables

Here we bind specific keys to commands for the Window or Dialog context. See further down which keys are associated with a context (via the binding table).
 

The following key bindings are specified in the EMF editor:

M1 => Ctrl (Command on Mac OS X)
M2 => Shift
M3 => Alt
M4 => undefined (Ctrl on Mac OS X)

Declarative:

M1+U => Open URI command
M1+O => Open command

Declarative through EABC (implicit by use of platform actions):

M1+W => Close
M1+M2+W => Close All
M1+S => Save
M1+M2+S=> Save All
M1+Q=>Quit

Dynamic Contributions, the IEditorActionBarContributor

The 3.x EMF editor mimics the menu and toolbar structure of the Eclipse IDE. Actions which are specific to the IDE (New, Open, Open URI and all the edit actions) are contributed declaratively in plugin.xml and by an IEditorActionBarContributor, respectively.

The solution for e4 requires some rework, as the contribution paradigm is different for e4.

Contributing the equivalent of the 3.x plugin.xml actions is done by creating fragments, which are added to our application. There will be two fragments: one for the EMF editor, and one contributed by org.eclipse.e4mf.edit.ui. The reason is that these actions could be contributed to an IDE instead of an RCP application.

[TODO] The contribution however is hardcoded in the fragment and too specific.

Notes on position in list: (I state it here, as it was not documented in most tutorials).

first
index:$theindex$
before:$theotherelementsid$
after:$theotherelementsid$
 
For the editor contributor, as there is no EditorPartDescriptor with an associated contributor, we need to mimic the functionality with dynamic contributions.

The contribution should occur only when the editor is active; in e4 that means when our designated MPart for the editor is active. The contribution is part of both the generated editor and emf.edit.ui.

To achieve this, we hook into the e4 event system. The editor will check whether an activated part's ID is the EMF editor; if so, it will set a context variable.
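
A sketch of how that hook could look, assuming the class is created by the e4 injector (for instance as a model addon). The editor part ID and the context variable name are illustrative, and UIEvents.UILifeCycle.ACTIVATE is the activation topic I would expect to use here:

import javax.inject.Inject;

import org.eclipse.e4.core.contexts.IEclipseContext;
import org.eclipse.e4.core.di.annotations.Optional;
import org.eclipse.e4.ui.di.UIEventTopic;
import org.eclipse.e4.ui.model.application.ui.basic.MPart;
import org.eclipse.e4.ui.workbench.UIEvents;
import org.osgi.service.event.Event;

public class EditorActivationTracker {

	// ID of the MPart hosting the EMF editor (assumption).
	private static final String EDITOR_PART_ID = "org.eclipse.emf.examples.library.e4editor.part.editor";

	@Inject
	private IEclipseContext context;

	@Inject
	@Optional
	public void partActivated(@UIEventTopic(UIEvents.UILifeCycle.ACTIVATE) Event event) {
		Object element = event.getProperty(UIEvents.EventTags.ELEMENT);
		if (element instanceof MPart) {
			MPart part = (MPart) element;
			// Set a context variable that the dynamic menu/toolbar contributions can test.
			context.set("activeEmfEditor",
					EDITOR_PART_ID.equals(part.getElementId()) ? part : null);
		}
	}
}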

[BUG]Unfortunately there is a bug: https://bugs.eclipse.org/bugs/show_bug.cgi?id=400217

Edit Menu

The Edit menu structure is

Edit
Undo   => M1+Z
Redo   => M1+M2+Z
----
Cut      => M1+X
Copy   => M1+C
Paste   => M1+V
----
Delete => Del
Select All => M1+A
[ADD_EXT]
[EDIT_END]
[MB_ADDITIONS]


Model Menu

EXTLibrary Editor
----[settings]
----[actions]
New Child -> [Containment children for selection]
New Siblings -> [Siblings for selection] 
Validate
Control...
---- [additions] 
Load Resource...
--- [additions_end]Refresh
Show Properties
--- [ui-actions]

Migrating the Actions

Most of the actions are part of org.eclipse.e4mf.edit.ui and build on the JFace IAction. The IAction and Action implementations as such are not incompatible with e4; however, the e4 workbench doesn't accept IActions. [There is the compatibility layer, with MRenderedMenu, but we go for pure e4 here.]

EXTLibraryModelWizard

Fixing the wizard is required. Wizards are supposedly a JFace-only implementation, but that is not exactly true: the generated model wizard for the EMF editor also implements INewWizard.

Now why is this there in the first place? Well, this is really to contribute the wizard to the workbench
when running in IDE mode.

[DECISION] Remove 'implement INewWizard'

As a consequence, we also need to fix how the editor will be opened; but as we refactored this for the OpenHandler into a HandlerSupport service, we simply call this method and have no dependency on the 3.x IWorkbench.

Additionally, we would like our model wizard to be part of the injection context.
To achieve this, we simply add the following annotation to the class definition:

@Creatable

Now a new instance will be created by the e4 DI whenever we refer to the EXTLibraryModelWizard class in our handler constructor for the 'New' command.
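
A sketch of such a 'New' handler using constructor injection; the handler class name is mine, and EXTLibraryModelWizard is the generated wizard discussed above, stripped of INewWizard:

import javax.inject.Inject;
import javax.inject.Named;

import org.eclipse.e4.core.di.annotations.Execute;
import org.eclipse.e4.ui.services.IServiceConstants;
import org.eclipse.jface.wizard.WizardDialog;
import org.eclipse.swt.widgets.Shell;

public class NewModelHandler {

	private final EXTLibraryModelWizard wizard;

	// Because EXTLibraryModelWizard is @Creatable, the injector instantiates it
	// (and satisfies its own @Inject members) whenever this handler is created.
	@Inject
	public NewModelHandler(EXTLibraryModelWizard wizard) {
		this.wizard = wizard;
	}

	@Execute
	public void execute(@Named(IServiceConstants.ACTIVE_SHELL) Shell shell) {
		new WizardDialog(shell, wizard).open();
	}
}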

org.eclipse.e4mf.common.ui 

Step 1. Renaming the packages.
Step 2. I remove the dependency on org.eclipse.ui and org.eclipse.ui.ide, which of course breaks a lot of stuff. 

org.eclipse.ui
=> org.eclipse.swt
=> org.eclipse.jface
=> org.eclipse.ui.workbench

.swt and .jface are re-exported, so removing org.eclipse.ui causes common.ui to break.
Now, as rendering is decoupled from the UI model, with .swt and .jface on top, it makes sense to let the current common.ui plugin be just one of these rendering implementations. Later on, we could have a common plugin with an alternative to SWT. For the sake of not over-complicating things, I don't separate the UI model part from the rendering (SWT) just yet, but it would be required. 

So, I add the following dependencies:
=> org.eclipse.swt
=> org.eclipse.jface

Looking at what is not resolved, this is more or less what I expect:
things like EditorPart, Memento, PlatformUI, AbstractUIPlugin etc. These 'services' are all done differently in e4, so I get to work on them one by one :-)

Step 3. EclipseUIPlugin =>

Now this extends AbstractUIPlugin; e4 services offer similar functionality for preferences, dialog settings and accessing resources like images.

[DECISION] For now it's best to let EclipseUIPlugin extend Plugin instead of AbstractUIPlugin. This means the aforementioned services will need to be provided the e4 way.

Step 4. Diagnostic Component =>

This class requires the shared images normally available from PlatformUI. We don't have PlatformUI in e4, so this should be migrated by using an injected resource service.

After some research: the 3.x shared images are not exposed as resources through the IResourcePool concept of e4 (part of tools.services). For EMF, I decide to create such a resource provider, or more explicitly a provider for the workbench images.

Note that in 3.x the images are registered in an ImageRegistry by the class WorkbenchImages.
The actual images are stored in org.eclipse.ui.

See this bug for the solution: https://bugs.eclipse.org/bugs/show_bug.cgi?id=404727


Workspace Stuff =>

One of the functionalities of EMF editors is the ability to interact with the workspace. The workspace is

.....TODO Continue migration of emf.common.ui


org.eclipse.e4mf.edit.ui



Step 1. Renaming the packages
Step 2. Extended Image Registry

Fix the fall back to PlatformUI for getting images.

Wednesday, April 3, 2013

Eclipse 4 Injecting a Resource Pool

Objective: Migrate the Eclipse 3.x shared images to e4

Requirements:
e4 Tooling (Eclipse Download)
e4 Tooling (Vogella Download) 

Example: The example can be obtained here
Level: Intermediate. Basic e4, understanding of OSGi and DI in e4, general Eclipse RCP experience

Everything about Eclipse 4 is different from programming against Eclipse 3.x. One example is obtaining resources like images, colors and fonts. Eclipse 4 is very good at exposing OSGi services through dependency injection, and a service for getting resources is already defined in the e4 tooling.

To make this very concrete, here is an example of how to register resources which can be obtained through a so called IResourcePool.

What I needed was access to the shared images from the 3.x framework.
Typically these resources would be obtained with

ISharedImages sharedImages = PlatformUI.getWorkbench().getSharedImages();
Image img = sharedImages.getImage(ISharedImages.IMG_OBJS_ERROR_TSK);

In e4 there is no PlatformUI singleton and, as far as I have investigated, the images in ISharedImages are not available if pure e4 is chosen. So how do we deal with this? Well, it turns out the e4 tooling has a service which can be implemented to make resources available. How does it work?

The solution is like this:

First, define an OSGI service which looks like this:


<?xml version="1.0" encoding="UTF-8"?>
<scr:component xmlns:scr="http://www.osgi.org/xmlns/scr/v1.1.0" immediate="true" name="org.eclipse.e4.tools.resources.workbenchresourcess">
   <implementation class="org.eclipse.e4.tools.resources.WorkbenchResourceProvider"/>
      <service>
      <provide interface="org.eclipse.e4.tools.services.IResourceProviderService"/>
   </service>
   <properties entry="OSGI-INF/resources.properties"/>
</scr:component>


This file should be stored in a folder named OSGI-INF under the plugin root.
What it does.

  1. First it tells us that this service is an implementation of IResourceProviderService. This service is defined in org.eclipse.e4.tools.services so this plugin should be available through e4 tooling.
  2. The implementation of the service is named WorkbenchResourceProvider and looks like below. This class extends a convenient implementation, also from the e4 tooling named BasicResourceProvider. Our implementation only needs to define keys (static strings) which the service will use to find the resources. 
  3. The resources are found by binding the keys to a location in a file named resources.properties, which is also defined in the OSGI component. 
WorkbenchResourceProvider (this is an extract; the example includes all the keys from ISharedImages):

public class WorkbenchResourceProvider extends BasicResourceProvider {
 
 /**
     * Identifies the error overlay image.
     * @since 3.4
     */
    public final static String IMG_DEC_FIELD_ERROR = "IMG_DEC_FIELD_ERROR"; //$NON-NLS-1$

    /**
     * Identifies the warning overlay image.
     * @since 3.4
     */
    public final static String IMG_DEC_FIELD_WARNING = "IMG_DEC_FIELD_WARNING"; //$NON-NLS-1$

    /**
     * Identifies the default image used for views.
     */
    public final static String IMG_DEF_VIEW = "IMG_DEF_VIEW"; //$NON-NLS-1$

resources.properties
IMG_OBJS_ERROR_TSK=/icons/full/obj16/error_tsk.gif
IMG_OBJS_INFO_TSK=/icons/full/obj16/info_tsk.gif
IMG_OBJS_WARN_TSK=/icons/full/obj16/warn_tsk.gif

The example includes all images which are packaged with the plugin org.eclipse.ui.
Note: Not all keys are implemented in the resources.properties file. Please add missing keys at will.

How to use this service:

  1. Include the resources plugin as part of an rcp product or launch configuration. 
  2. Make sure the plugin is autostarted and that the start level is 0 or 1 so the service is available before the app tries to access it. 
  3. To use it in a class do the following: 

@Inject
IResourcePool poolOfResources;

Image img = poolOfResources.getImageUnchecked(WorkbenchResourceProvider.IMG_OBJS_ERROR_TSK);


Perhaps the implementation of the resource service could be smarter and find the images by type, as explained here; this is also how the 3.x images are registered. But for now this works very well.


The example can be obtained here

Have fun!

What if it doesn't work

Very likely the OSGI service is not available when consulted. Make sure the service is available.
Note: Services can be consulted from the OSGI console.


What is the future of this

I don't know, but I think the resources from 3.x should be available for e4 apps. Follow the bug here

Friday, March 29, 2013

Migrating Instiki to Rails 3.2

Objective: Migration of Instiki to Rails 3.2 

Instiki is a great wiki system, which I am using to drive the content of http://www.netxforge.com. Instiki is however behind with respect to the Rails version it was created with. This is a problem for me, as I want to integrate Instiki into a broader Ruby on Rails application. 

This blog post describes the steps I took to do it. The conversion is unfortunately not available as a download or in a repo somewhere, but I hope that the list of issues (and how I solved them) can help others repeat it and make the upgrade available. 

The creator of Instiki, alias Distler has endorsed this initiative in the instiki forum.

So here are the steps: 

Warning: some of this is perhaps a bit specific to my setup, so be careful. Also, I haven't done everything yet, and some things don't work correctly. One example is the use of JavaScript: in Rails 3, jQuery is favoured over Prototype.js etc., and the use of assets and the separation of JavaScript from ERB templates is good practice. Some of this is still to be done. 


----- (I recommend starting the server after each step, to get a feel of the progress)


Step 1. Use a rails 3.2 generated app to compare the current instiki and the rails 3.2 structure. We refer to this as the Template Rails App (TRA)

Step 1.1 Update the Gemfile (It now looks like this, see explanation in separate steps)

source "http://rubygems.org"

gem "rails", "=3.2.12"

# Gems used only for assets and not required
# in production environments by default.
group :assets do
  gem 'sass-rails',   '~> 3.2.3'
  gem 'coffee-rails', '~> 3.2.1'

  # See https://github.com/sstephenson/execjs#readme for more supported runtimes
  # gem 'therubyracer', :platforms => :ruby

  gem 'uglifier', '>= 1.0.3'
end

gem 'jquery-rails'

gem "sqlite3", :require => "sqlite3"
gem "itextomml", ">=1.4.10"
gem "rack", ">=1.4.5"
gem "mongrel", ">=1.2.0.pre2"
gem "rubyzip"
gem "RedCloth", ">=4.0.0"
gem "erubis"
gem "nokogiri"
gem "rake"
gem "rdoc"
gem "json"
gem "file_signature", :git => 'http://github.com/distler/file_signature.git'
gem "maruku", :git => 'http://github.com/distler/maruku.git', :branch => 'nokogiri'
# gem "mysql2"

Step 2. Replace the /script's content with the TRA's /scripts content, delete the old files from /script

Step 3. Migration of the /config folder

Step 3.1 Create application.rb (Rails 3 has an application.rb file) and configure it in the steps below.
   
    Load the /lib folder, as this is turned off by default; edit/add these params:

config.autoload_paths << "#{Rails.root}/lib"
config.autoload_paths << "#{Rails.root}/lib/chunks"
    Do some java script setup, for the new rails 3 assets concept. Instiki should actually migrate away from scriptaculous.js and use JQuery
        
config.action_view.javascript_expansions[:legacy] = %w(prototype.js scriptaculous.js)

Step 3.3 copy in the environments/ and initializers/ from TRA     

The TRA application name will be different from what Instiki's should be. The following files should be edited, and the first line should be renamed from 

"TemplateApp::Application.configure do" to "InstikiApp::Application.configure do"
 
config/environments/development.rb     
config/environments/test.rb     
config/environments/production.rb 


Step 3.4 move the original environment.rb out of the way (renamed it to environment.rb.backup), and copy the environment.rb from the template rails app   

Some migrations from the contents of this file.         

Rename the last line to InstikiApp::Application.initialize!     

require_dependency 'instiki_errors' => Moved this to a custom initializer named config/initializers/instiki_init.rb It looks like this: 
 
require 'instiki_errors'
require 'wiki_content'
        

require 'instiki_errors' # migrated from instiki environment.rb     
require 'wiki_content' # Needed to load properly         [TODO]     

rexml_versions => Not sure what to do with this. It scans various directories to get an REXML version.     
# Miscellaneous monkey patches (here be dragons ...)     

require 'caching_stuff'
require 'logging_stuff'
require 'rack_stuff'

Note: Not using require_dependency, as this is undocumented (Rails internal) and for development only; not really a requirement here, I believe. 

Step 3.4 Copy in boot.rb from TRA, remove preinitializer.rb which is a pre Rails 3 hack to get bundler working. 

See: http://gembundler.com/v1.3/rails23.html 

Step 3.5 routes.rb     [Careful] This routes.rb is slightly specific to my application, but it includes the wiki routes, so you can extract the relevant ones:
 
[TODO] Some routes don't work in the Rails 3 Instiki and need to be fixed.
 
def connect_to_web(generic_path, generic_routing_options, *options)
  if defined? DEFAULT_WEB
    explicit_path = generic_path.gsub(/:web\/?/, '') # Strip the /:web
    explicit_routing_options = generic_routing_options.merge(:web => DEFAULT_WEB)
    match explicit_path, explicit_routing_options
  end

  match generic_path, generic_routing_options
# map.connect(generic_path, generic_routing_options)
end

# :id's can be arbitrary junk
id_regexp = /.+/

InstikiApp::Application.routes.draw do

# SEE:  http://yehudakatz.com/2009/12/26/the-rails-3-router-rack-it-up/

  root :to => 'public#page', :id => 'HomePage'

  # Wiki Admin:

  match 'create_system', :to => 'admin#create_system'
  match 'create_web', :to => 'admin#create_web'
  match 'delete_web', :to => 'admin#delete_web'
  match 'delete_files', :to => 'admin#delete_files'
  match 'web_list', :to => 'wiki#web_list'

  # Application
  match ':controller/:action(/:id)'

  # Wiki webs routing
  connect_to_web ':web/edit_web',  :to => 'admin#edit_web' #Edit an arbitrary web.
  connect_to_web ':web/remove_orphaned_pages',  :to => 'admin#remove_orphaned_pages' #Remove pages which are not referenced by any other page
  connect_to_web ':web/remove_orphaned_pages_in_category',  :to => 'admin#remove_orphaned_pages_in_category'
  connect_to_web ':web/file/delete/:id',  :to => 'file#delete', :constraints => {:id => /[-._\w]+/}, :id => nil
  connect_to_web ':web/files/pngs/:id',  :to => 'file#blahtex_png', :constraints => {:id => /[-._\w]+/}, :id => nil
  connect_to_web ':web/files/:id',  :to => 'file#file', :constraints => {:id => /[-._\w]+/}, :id => nil
  connect_to_web ':web/file_list/:sort_order',  :to => 'wiki#file_list', :sort_order => nil
  connect_to_web ':web/import/:id',  :to => 'file#import'
  connect_to_web ':web/login',  :to => 'wiki#login'
  connect_to_web ':web/web_list',  :to => 'wiki#web_list'
  connect_to_web ':web/show/diff/:id', :to => 'wiki#show', :mode => 'diff', :requirements => {:id => id_regexp}
  connect_to_web ':web/revision/diff/:id/:rev',  :to => 'wiki#revision', :mode => 'diff', :constraints => { :rev => /\d+/, :id => id_regexp}
  connect_to_web ':web/revision/:id/:rev',  :to => 'wiki#revision', :constraints => { :rev => /\d+/, :id => id_regexp}
  connect_to_web ':web/source/:id/:rev', :to => 'wiki#source', :constraints => { :rev => /\d+/, :id => id_regexp}
  connect_to_web ':web/list/:category',  :to => 'wiki#list', :constraints => { :category => /.*/}, :category => nil
  connect_to_web ':web/recently_revised/:category',  :to => 'wiki#recently_revised', :requirements => { :category => /.*/}, :category => nil
  connect_to_web ':web/:action/:id',  :to => 'wiki', :constraints => {:id => id_regexp}
  connect_to_web ':web/:action', :to =>  'wiki'
  connect_to_web ':web',  :to => 'wiki#index'
end

Step 4 problem with plugin: protect_forms_from_spam, comment it out. Find an alternative

Step 5. Deal with assets (Stylesheets, Javascript, Images)

    Read this: http://guides.rubyonrails.org/asset_pipeline.html
   
    Assets should be pre-compiled with:
        bundle exec rake assets:precompile
    these will end up in the /public/assets/ folder

Step 5.1 Update the Gemfile to include:

    # Gems used only for assets and not required
    # in production environments by default.
    group :assets do
      gem 'sass-rails',   '~> 3.2.3'
      gem 'coffee-rails', '~> 3.2.1'

      # See https://github.com/sstephenson/execjs#readme for more supported runtimes
      # gem 'therubyracer', :platforms => :ruby

      gem 'uglifier', '>= 1.0.3'
    end

Step 5.2 Create asset folders
   
    /app/assets/stylesheets

        - copy in application.css from TRA
        - [OPTIONAL] rename .css to .css.erb to use assets in CSS for example: <%= asset_path 'someimage.png' %>        

    /app/assets/javascripts

        - application.js is auto created, but we already have an application.js file with a bit of scripts in it. so copy the following lines
        from TRA and add them to the instiki application.js
       
        // This is a manifest file that'll be compiled into application.js, which will include all the files
        // listed below.
        //
        // Any JavaScript/Coffee file within this directory, lib/assets/javascripts, vendor/assets/javascripts,
        // or vendor/assets/javascripts of plugins, if any, can be referenced here using a relative path.
        //
        // It's not advisable to add code directly here, but if you do, it'll appear at the bottom of the
        // the compiled file.
        //
        // WARNING: THE FIRST BLANK LINE MARKS THE END OF WHAT'S TO BE PROCESSED, ANY BLANK LINE SHOULD
        // GO AFTER THE REQUIRES BELOW.
        //
        //= require jquery
        //= require jquery_ujs
        //= require_tree .

    /app/assets/images
        - use image_tag helper methods.

Step 5.3
   
    Copy the assets from /public into the respective locations under /app/assets from the 2.x Instiki.

Step 6 Replace @controller with controller wherever this occurs in the controllers.

Step 7. Fix issues with rendering and the default layout.
    [DEBUG]: Render an action without the layout for troubleshooting: render :layout => false
    Fix the layout content insertion point: it was @content_for_layout versus the modern yield, so => <%= yield %>

    Use: <%= debug params %>

Step 8. problem with sublayout: (Wiki source)
   
    Showing /Users/Christophe/Documents/Spaces/netxforge_aptana/com.netxforge.store/app/views/layouts/application.html.erb where line #46 raised:

    undefined method `sub_layout' for #<WikiController:0x007ffb07b5bf58>

Step 9. link_to_remote issues: for now the workaround is not using the :update tag. [TODO]

    Currently, URLs like this are generated, which embed JS into the link with an onclick statement:

    <a onclick="new Ajax.Updater('intropage', '/public/page/features?menu=true&partial=true',
    {asynchronous:true, evalScripts:true}); return false;" href="#"></a>

    :update is not supported anymore in Rails 3. (AJAX call back to update a DOM id), see the following articles:

    http://www.simonecarletti.com/blog/2010/06/unobtrusive-javascript-technique/

    New approach is based on separation of the HTML and JS, so the JS should do the update.

Step 10. Issues with form_tag

    in the template edit.html.erb, the form tag starts with the '<%' ERB opening tag, but it should be '<%='.
    Replacing this fixed the problem.

    See 3.0 Release notes, section 7.4.2.

    Note: The included JavaScript should be moved to application.js or another .js file under /app/assets/javascripts

Step 11. ActiveModel::MassAssignmentSecurity errors on various model objects [SOLVED]
    Because of this:
   
    https://gist.github.com/peternixey/1978249

    add attr_accessible in the various model objects to fix these errors:

    Page =>  attr_accessible :locked_at, :locked_by, :name
    Revision=> attr_accessible :revised_at, :page, :content, :author
    .....

Step 12. Undefined method error in WikiContent [SOLVED]
    It turned out that
    include ChunkManager (in wiki_content.rb) didn't load properly, as
    ChunkManager has dependencies on various other classes in /chunks.
    Made sure these are loaded in application.rb (see Step 3.1).

    however this causes another issue:

Step 12.1 Error when including /lib/chunks
    Expected .../lib/chunks/wiki.rb to define Wiki
    Actually the whole app now fails with different errors:

    This could be a conflict in naming of classes, as /chunks defines a wiki.rb
    See this post:
    http://stackoverflow.com/questions/10948779/expected-to-define-when-calling-class-inside-a-module

    fixed by:

    - renamed wiki to wiki_c, and the references to it.
    - removed the call to html_safe in WikiContent.render!, as this points to a method which
    will produce an ActiveSupport::SafeBuffer instance from the WikiContent, not adhering to the instance type;
    this gave method errors.

Step 13. The URL generator in the wiki gives wrong URLs for: [TODO]
    all Pages => /:web/list/HomePage (so the :page is appended and should not be).
    edit Web => /:web/edit_web/HomePage (:page appended, should not be).

Step 14. Dealing with XML templates.
    renamed atom.rxml to atom.builder simply solved the problem.

Step 15. Grab a coffee, and reflect on your great achievements so far :-)

Step 16. Warning: You don't need to install rails_xss as a plugin for Rails 3 and after. [TODO]
   
    http://simianarmy.com/post/11117853564/upgrading-to-rails-3-rails-xss
    - What to do with this? It's not clear to me what the new situation is.

Step 17. [2013-03-13 12:26:00] WARN  Could not determine content-length of response body. Set content-length of the response or set Response#chunked = true

    Caused by webrick server, don't worry about it.


Thursday, December 27, 2012

Rolling the Calendar

For NetXStudio, I needed a function which would split a period into sub-periods. The function needs to be generic, so I can pass in a Calendar field which would then split the Period object according to that field's value.

First some definitions:
  • The DateTimeRange object is a simple Java Object with set and get methods for two Date fields named 'begin' and 'end'. 
  • Day start is defined at 00:00h and the day ends at 23:59h (yes, there is one lost minute, but that's OK for me). 
The Java Calendar, or better the GregorianCalendar, is capable of rolling forward and backward as a real calendar would do. The method roll(...) is used for that. The Calendar can be forced into a specific position using the set(...) method.

My method to deal with at least Calendar.MONTH and Calendar.DAY_OF_MONTH ended up using the methods getActualMinimum(...) and getActualMaximum(...) to set the Calendar to the desired position. The desired positions are:
  • Calendar.MONTH => The first day of the current month. 
  • Calendar.DAY_OF_MONTH => The first hour of the day. 
Now to roll into these positions, I need to know the 'child' Calendar field, to be used in the getActualMin/Max() methods. By 'child' field I mean the following mapping:
  • Calendar.MONTH => Calendar.DAY_OF_MONTH
  • Calendar.DAY_OF_MONTH => Calendar.HOUR_OF_DAY
Calling getActualMinimum(Calendar.DAY_OF_MONTH) and setting this as the new Calendar position sets the Calendar to the first day of the current month. Calling getActualMinimum(Calendar.HOUR_OF_DAY) and setting this on the Calendar sets it to the first hour of the day.
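
To make the parent/child mapping concrete, here is a tiny self-contained example (the class name is only for illustration):

import java.util.Calendar;
import java.util.GregorianCalendar;

public class CalendarPositionExample {

	public static void main(String[] args) {
		final Calendar cal = GregorianCalendar.getInstance();

		// Parent field Calendar.MONTH -> child field Calendar.DAY_OF_MONTH:
		// move to the first day of the current month.
		cal.set(Calendar.DAY_OF_MONTH, cal.getActualMinimum(Calendar.DAY_OF_MONTH));

		// Parent field Calendar.DAY_OF_MONTH -> child field Calendar.HOUR_OF_DAY:
		// move to the first hour of that day.
		cal.set(Calendar.HOUR_OF_DAY, cal.getActualMinimum(Calendar.HOUR_OF_DAY));

		System.out.println(cal.getTime());
	}
}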

All fine... The unit test on a 6-month period, which was asked to split it first by MONTH and subsequently by DAY_OF_MONTH, nicely generated the intended result.

Full: From: 27-06-2012 @ 00:00 To: 27-12-2012 @ 00:00
 Month: From: 01-12-2012 @ 00:00 To: 27-12-2012 @ 23:59
   Day: From: 27-12-2012 @ 00:00 To: 27-12-2012 @ 23:59
   Day: From: 26-12-2012 @ 00:00 To: 26-12-2012 @ 23:59
   Day: From: 25-12-2012 @ 00:00 To: 25-12-2012 @ 23:59
   Day: From: 24-12-2012 @ 00:00 To: 24-12-2012 @ 23:59
   Day: From: 23-12-2012 @ 00:00 To: 23-12-2012 @ 23:59
   Day: From: 22-12-2012 @ 00:00 To: 22-12-2012 @ 23:59
   Day: From: 21-12-2012 @ 00:00 To: 21-12-2012 @ 23:59

Being confident about my code, I expanded it by adding an option to split the period into weeks. This is where trouble started. First I had to pick the 'child' field for the day of the week, for the 'parent' field Calendar.WEEK_OF_YEAR.

I quickly chose to use Calendar.DAY_OF_WEEK for this; makes sense, right? Well, that's what I initially thought. Setting the Calendar to getActualMinimum(Calendar.DAY_OF_WEEK) actually didn't set the Calendar position to what I expected. It took me a while to figure out why. The reason is best explained with the following diagram:




In the diagram, the Calendar is set to today (which happens to be the publication date of this blog post, but that is purely a coincidence, trust me on this). Now, calling getActualMinimum with my freshly developed method for Calendar.DAY_OF_MONTH and Calendar.HOUR_OF_DAY works as expected: the Calendar rolls to the intended position. For Calendar.DAY_OF_WEEK however, it doesn't. It actually rolls forward! What I really expected was the new Calendar position to be at least the previous Sunday. Which was not even correct, as it should be the Monday for my Locale.

Mmmh... it triggered my curiosity (OK, first there were a few moments of programmer frustration). Why does it do that? I read about Calendar and noticed a bit of documentation on Calendar field conflicts. Suddenly I realized I should not use the actual minimum for rolling the week day. What I should use is getFirstDayOfWeek() and set this on the calendar.

OK, but this means that depending on the field, my method needs to act differently, which was not my initial goal. So far I haven't found another solution with the current GregorianCalendar capabilities. I even had to implement a function getLastDayOfWeek(Calendar cal) to set the end boundary of the week. I encourage any reader to comment and let me know a better solution. (Note: I know about date/time libraries, but without them, can you make this better?)

Here is the code for the final solution:

public List<DateTimeRange> periods(DateTimeRange dtr, int calField) {

  boolean weekTreatment = false;

  int childField = -1;
  switch (calField) {
  case Calendar.MONTH: {
   childField = Calendar.DAY_OF_MONTH;
  }
   break;
  case Calendar.DAY_OF_MONTH: {
   childField = Calendar.HOUR_OF_DAY;
  }
   break;
  case Calendar.WEEK_OF_YEAR: {
   childField = Calendar.DAY_OF_WEEK;
   weekTreatment = true;
  }
   break;
  }

  List<DateTimeRange> result = Lists.newArrayList();

  if (childField == -1) {
   result.add(dtr);
   return result;
  }

  final Calendar cal = GregorianCalendar.getInstance();
  cal.setTime(dtr.getEnd().toGregorianCalendar().getTime());

  // An end calendar to compare the calendar field, and not take the field
  // maximum but the value from the end calendar.
  final Calendar endCal = GregorianCalendar.getInstance();
  endCal.setTime(dtr.getEnd().toGregorianCalendar().getTime());

  // Go back in time and create a new DateTime Range.
  do {

   // Set the begin to the actual minimum and end to the actual
   // maximum, except at the start, where we keep the actual.
   // At the end, roll one beyond the minimum to set the new actual.
   if (cal.get(calField) != endCal.get(calField)) {
    if (weekTreatment) {
     // :-( there is no method to get the last day of week.
     cal.set(childField, getLastDayOfWeek(cal));

    } else {
     cal.set(childField, cal.getActualMaximum(childField));
    }

   }

   final Date end = cal.getTime();

   // Special Treatment for Week
   if (weekTreatment) {
    final int firstDayOfWeek = cal.getFirstDayOfWeek();
    cal.set(Calendar.DAY_OF_WEEK, firstDayOfWeek);
   } else {
    int minimum = cal.getActualMinimum(childField);
    cal.set(childField, minimum);
   }

   Date begin;
   if (cal.getTime().getTime() < dtr.getBegin().toGregorianCalendar()
     .getTimeInMillis()) {
    begin = this.fromXMLDate(dtr.getBegin());
   } else {
    begin = cal.getTime();
   }

   final DateTimeRange period = period(this.adjustToDayStart(begin),
     this.adjustToDayEnd(end));
   result.add(period);

   // Roll back one more, so the new actual minimum/maximum can be applied.
   cal.add(calField, -1);
  } while (cal.getTime().getTime() > dtr.getBegin().toGregorianCalendar()
    .getTimeInMillis());

  return result;

 }


Notes:

  1. Usage of Google Collect to produce collections.
  2. The DateTimeRange type holds a 'begin' and 'end' member fields. 
  3. The period(Date begin, Date end) method is a factory for an instance of type DateTimeRange. 
  4. The method ignores the beginning of the period, which precedes the intended boundary.
  5. The method getLastDayOfWeek(Calendar cal) looks like this:

public int getLastDayOfWeek(Calendar cal) {
  final int firstDayOfWeek = cal.getFirstDayOfWeek();

  final int lastDayOfWeek;
  if (firstDayOfWeek != 1) {
   lastDayOfWeek = firstDayOfWeek - 1; // One before the first day...
  } else {
   lastDayOfWeek = cal.getActualMaximum(Calendar.DAY_OF_WEEK); // Expect
  }
  return lastDayOfWeek;
 }


Thursday, November 22, 2012

Lazy loading CDO backed JFace Viewer.

This post is about achieving a good user experience in a user interface while presenting a potentially large set of data which is retrieved from a back-end system. In our case the UI is an Eclipse RCP application which uses JFace viewers, and the back-end system is Connected Data Objects, in short CDO. CDO is an object-persistence middleware system based on the Eclipse Modeling Framework (EMF).

Objective

The objective is to load data in a UI widget "just in time", in this case meaning whenever data becomes visible. The Eclipse JFace library has such a facility for scrolling through a TableViewer or TreeViewer. For this to work, the following conditions need to be met:
  1. The Content Provider needs to be an ILazyContentProvider for TableViewers and a ILazyTreeContentProvider for TreeViewers. 
  2. As data is fetched dynamically, the content provider does not know the number of items in the viewer. Therefore it is required to call setItemCount(...) on the viewer. 
  3. The TableViewer should be created with the SWT.VIRTUAL style to signal that the viewer should be updated only when data becomes visible.
An example of such a content provider implementation could look like this:
public class LazyListContentProvider implements ILazyContentProvider {

 private Viewer viewer;
 private List<?> content;

 public void dispose() {
 }

 public void inputChanged(Viewer viewer, Object oldInput, Object newInput) {
  this.viewer = viewer;
  this.content = (List<?>) newInput;
 }

 public void updateElement(int index) {
  if (viewer instanceof TableViewer) {
   // Replace the virtual placeholder at this index with the real element.
   ((TableViewer) viewer).replace(content.get(index), index);
  } else {
   throw new UnsupportedOperationException(
     "Only table viewers are supported");
  }
 }
}


Here updateElement(int index) will be called whenever more items are needed, which stems from the user scrolling the viewer down and revealing more rows.

The code for creating the table viewer would look something like this:

tblViewer = new TableViewer(frmTolerances.getBody(),
  SWT.BORDER | SWT.MULTI | SWT.FULL_SELECTION | SWT.VIRTUAL | widgetStyle);

tblViewer.setItemCount(((CDOResource) resource).eContents().size());

tblViewer.setContentProvider(new LazyListContentProvider());
tblViewer.setLabelProvider(....any label provider...);
tblViewer.setInput(toleranceResource.getContents());

Filters, Sorting and History


Now there is one limitation with using an ILazyContentProvider: it's not capable of dealing with the sorting and filtering capabilities of a JFace viewer. Why not? Well, the viewer doesn't have access to the complete set of items, and as both sorting and filtering require the full set of items, it won't work.

However, there is a potential solution to enable these functions, and if you are a user of Eclipse, chances are you already use it on a daily basis. The same dilemma existed for finding types in the IDE and for other search operations which require large sets of data to be presented, filtered, remembered and sorted. The implementation of this function is the FilteredItemSelectionDialog. If we look at this class, we notice that it complies with all requirements for lazy loading data: it uses a content provider which implements ILazyContentProvider, it acts on a TableViewer with the SWT.VIRTUAL flag, and it sets the item count. The really nifty thing, however, is all the additional facilities it offers as well.

As an example and teaser, here is a screenshot of a view (in this case hosted in an Eclipse Form) which uses the base class referred to further down.




Filtering 

The concept of filtering in the FilteredItemSelectionDialog is best decomposed, to show what actually happens when filtering.

The dialog has a text / search entry widget which allows a filtering pattern to be applied to the data set. It works by adding the items which match the filter criteria to a dedicated filtered-items collection. For this, the content provider needs to be fed with items when the dialog is realized. As the Dialog is abstract, a concrete implementation will implement the methods:
@Override
protected void fillContentProvider(AbstractContentProvider contentProvider,
  ItemsFilter itemsFilter, IProgressMonitor progressMonitor)
  throws CoreException {
 ...
}

@Override
protected ItemsFilter createFilter() {
 ...
}
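
As a hedged example (the resource field holding the CDO data is an assumption, in line with the table viewer snippet above), fillContentProvider could simply feed every candidate to the content provider and let the filter decide:

@Override
protected void fillContentProvider(AbstractContentProvider contentProvider,
  ItemsFilter itemsFilter, IProgressMonitor progressMonitor) throws CoreException {

 progressMonitor.beginTask("Filling the content provider", resource.getContents().size());
 for (EObject candidate : resource.getContents()) {
  // add(...) only keeps the item when it matches the current ItemsFilter.
  contentProvider.add(candidate, itemsFilter);
  progressMonitor.worked(1);
 }
 progressMonitor.done();
}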

The concrete implementation will feed the content provider with the data, considering the provided filter. Note that the filter is already applied when the Dialog is realized. The method which initiates these activities is applyFilter(). It is also called when the pattern-matching input text changes.

applyFilter() will invoke a sequence of background activities, which use the Eclipse Job API. The sequence is:

(1) applyFilter()
           |___ (1a) createFilter()
           |
(2) FilterHistoryJob -- (4) RefreshCacheJob -- (5) RefreshJob
           |
(3) FilterJob -- (4) RefreshCacheJob -- (5) RefreshJob
         
(6) RefreshProgressMessageJob

1a ItemsFilter

The ItemsFilter is abstract and should be implemented by Clients. A typical implementation will extract a relevant item attribute like the "name" of the item and match it against the filter.
This is as easy as calling the .matches(String) method on the ItemsFilter class.

Note that the ItemsFilter can be instantiated with a custom SearchPattern(int rule) class. The SearchPattern can hold various matching rules like exact matching or pattern matching and case sensitive matching.
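
A hedged sketch of createFilter(), where the items are assumed to be plain Strings for illustration; a real implementation would match a model attribute such as a name:

@Override
protected ItemsFilter createFilter() {
 return new ItemsFilter() {

  @Override
  public boolean matchItem(Object item) {
   // Match the item's relevant attribute against the entered pattern.
   return item instanceof String && matches((String) item);
  }

  @Override
  public boolean isConsistentItem(Object item) {
   // Nothing can make our items stale in this sketch.
   return true;
  }
 };
}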

2 FilterHistoryJob

First, applyFilter() calls the createFilter() method, which calls the concrete implementation. Next it invokes a background Job which populates the items with a potential history (still considering the filter). This job is named FilterHistoryJob. The history is managed by an abstract SelectionHistory class, which clients should extend. The purpose is the ability to customize the serialization to XML of the items which need to be remembered. Items can be added to the history as we wish. In the case of the FilteredItemSelectionDialog, selected items are remembered when OK is pressed. This will actually only happen when the SelectionHistory has been set using:


.setSelectionHistory(SelectionHistory);


In our implementation, as we deal with CDO objects, we use the unique CDOID as the serialization option.
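
A sketch of such a CDOID based SelectionHistory, written as an inner class of the dialog/viewer subclass; the open CDOView named view, used to resolve the stored IDs, is an assumption here:

private class CDOSelectionHistory extends SelectionHistory {

 private static final String ID_KEY = "cdoid"; // our own memento key

 @Override
 protected Object restoreItemFromMemento(IMemento memento) {
  final String fragment = memento.getString(ID_KEY);
  if (fragment == null) {
   return null;
  }
  // Resolve the stored CDOID against the open view (may load the object on demand).
  return view.getObject(CDOIDUtil.read(fragment), true);
 }

 @Override
 protected void storeItemToMemento(Object item, IMemento memento) {
  if (item instanceof CDOObject) {
   final StringBuilder builder = new StringBuilder();
   CDOIDUtil.write(builder, ((CDOObject) item).cdoID());
   memento.putString(ID_KEY, builder.toString());
  }
 }
}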

The FilterHistoryJob will already refresh the viewer with the result from history, if any exists and the filter matches the history items, but as a last step it will also invoke the next Job, which is the FilterJob.

3 FilterJob

The FilterJob will do the actual filtering based on the current ItemsFilter as created in the first step. This is delegated to the method filterContent() in the job.

4 RefreshCacheJob

The RefreshCacheJob will refresh the viewer with the result of the filtered items in two steps. First the cache is refreshed by the procedure described here:

The filtered items collection is the result of several actions:

  1. First the current items are sorted using a client-specified Comparator. 
  2. Next a potential additional ViewerFilter is applied. The ViewerFilter(s) can be defined by calling addFilter(ViewerFilter).  
  3. Finally a separator is inserted in the cache between the historical items and any item not in history which has previously made it through the ItemsFilter and the additional ViewerFilter(s). 

The second and last step is to invoke another job, which is the RefreshJob.

5 RefreshJob

This is a UI-thread Job, as it actually refreshes the TableViewer. As we use an ILazyContentProvider, what happens is that the item count is set on the table viewer with setItemCount(). The selection is also saved and restored around the TableViewer refresh. Note that a refresh will trigger the ILazyContentProvider's updateElement method, which in turn replaces the viewer items with the cached items. 

6 RefreshProgressMessageJob

This job refreshes the loading progress message; it's cyclical, scheduling itself every 500 milliseconds while the progress is not cancelled or done. 

Applying the method to a lazy loading viewer

Now that we understand the FilteredItemSelectionDialog's inner workings, we can apply the same technique to a UI component which is not a Dialog, but perhaps a ViewPart which presents CDO data. The steps to do so are:

  1. Extract the relevant code from FilteredItemSelectionDialog
  2. Change the algorithm for populating the initial items list. 

Refactor FilteredItemSelectionDialog 

There are various things we want to do:

  • Remove Dialog specifics 

Here we want to remove all the Dialog-specific stuff and refactor it as a "regular" Eclipse UI component. Various aspects like saving dialog settings and handling button selection are not required, so they can be removed. 
  • Support for multiple columns in the TableViewer
Also we want to support multiple columns. For this we need to provide a hook for clients to create the columns. Additionally we want the default implementation of the label provider (which is very sophisticated) to also support multiple columns. This is done by supporting ITableLabelProvider; a sketch of such a label provider follows below.
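
A minimal sketch of a multi-column label provider (the class name and the column semantics, name and EClass, are invented for illustration; a real implementation would delegate to the sophisticated default label provider for the first column):

import org.eclipse.emf.ecore.EObject;
import org.eclipse.jface.viewers.ITableLabelProvider;
import org.eclipse.jface.viewers.LabelProvider;
import org.eclipse.swt.graphics.Image;

public class MultiColumnLabelProvider extends LabelProvider implements ITableLabelProvider {

 public Image getColumnImage(Object element, int columnIndex) {
  return null; // no icons in this sketch
 }

 public String getColumnText(Object element, int columnIndex) {
  if (element instanceof EObject) {
   final EObject eObject = (EObject) element;
   switch (columnIndex) {
   case 0:
    return eObject.toString(); // a real model would return a name attribute here
   case 1:
    return eObject.eClass().getName(); // the EClass as a "type" column
   }
  }
  return ""; // empty text for unknown columns/elements
 }
}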


With a bit of work, you could end up with an abstract base class like this: AbstractLazyTableViewer

This is merely an example, but it is self-contained and should work for clients extending it. 
Note 1: The refactored version assumes it will be hosted in a Form. 

Algorithm for populating the initial items list

The implementation will, without adaptations, populate the entire content in the viewer if a pattern like "?" is entered. Although an ILazyContentProvider is used, it's not acting entirely as we might expect. What happens is that the method fillContentProvider expects the following method to be called:

public void add(Object item, ItemsFilter itemsFilter)

Now that's OK for a data source which is local, but in the case of a remote CDO repository it forces hard labour instead of laziness, so we need to come up with a different solution. Unfortunately I am running out of time now... perhaps a thought for another post. 

Tuesday, January 3, 2012

Xtext Connected scoping - CDO and Xtext 2.0

Just last year I ventured into integrating an expression language which can be used to perform operations on parts of an EMF model. The building blocks were decided to be Xtext and CDO, both based on EMF but not interworking out of the box.

There are likely several ways in which Xtext and CDO could integrate. The serialization of an Xtext model into a CDO resource could be thought of. What I needed was different: I wanted to be able to execute expressions on objects stored in a CDO repository. The solution: we would use a regular Xtext resource, but would need to reference other objects stored in CDO. The serialized version of our Xtext resource would need to be stored somewhere as well. For this I decided to simply store it as a String in one of the features of the CDO model.


The diagram here shows the idea.



Models A and B are Ecore models which have been adapted to work with CDO. In order for EMF models to benefit from the CDO capabilities, CDO can convert an EMF .genmodel so that it generates CDO-optimized Java.

Script.xtext is an Xtext grammar which implements an expression DSL. It does arithmetic operations like + - * /.

The Script.xtext grammar includes import statements for model A and B, so the grammar rules reference these external models in the defined grammar expressions. Xtext terminology for these references is a "cross-link".

In Xtext, referencing external model objects is a well-known pattern which is documented in the great Xtext documentation. I won't go into how this is done exactly, but it boils down to telling the workflow where to find the A and B models.

The cross-links pattern, is also explained in the documentation. This is the interesting part. The formal grammar for cross links is:

CrossReference :
  '[' type=ReferencedEClass ('|' terminal=CrossReferenceTerminal)? ']'
;

In our Script.xtext we use cross-links linking to the external models A and B in similar fashion.
Behind the scenes, Xtext will resolve the links in the so-called linking process. By default Xtext generates a linker named LazyLinker, which works just fine with CDO based resources. The second step in linking is the definition of the linking semantics. The Xtext linker is lazy: it links the object when needed, based on a proxy URI which is created and set. An Xtext proxy URI looks like this.


"xtext resource.extension"#xtextLink_::0.3.0.0::3::/4


See the class LazyURIEncoder [src] for details on the encoding technique. Next, the Xtext scoping API comes into play; it has been the focus of most of my efforts.

In Xtext, scoping is either local or global. The local scope comes from the grammar's own model, and global scoping comes from external models. We are especially interested in the IGlobalScopeProvider. It is the scope provider's responsibility to return an IScope for a given EReference, in our case a cross-link to either model A or B.

Xtext ships with two GlobalScopeProvider implementations. One is based on explicitly referencing resources by a grammar naming convention: the ImportUriGlobalScopeProvider. The other is the DefaultGlobalScopeProvider, which leverages the Java class path to find resources.

What I needed was a GlobalScopeProvider which was neither of these two, but would rather use a fixed set of CDO resources as the base to determine the applicable IScope. The CDO resources holding objects from packages A and B were well known to me and resolvable by a URI, so I looked for a way to use the CDO resource URI as a base for building the IScope. The set of CDO resource URIs in my case is fixed, so I could simply hardcode the URIs in the IGlobalScopeProvider.

With the highly customizable nature of Xtext, it was rather straightforward to replace/enhance the following components.

IResourceServiceProvider

Here I simply extended the DefaultResourceServiceProvider to override the method canHandle(URI uri).
This method needs to support CDO URIs, which look like this:

cdo://"repo name"/folder/resource

In my implementation, I look for a URI scheme starting with "cdo".
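
A minimal sketch of that override (the fallback to super is my own choice, not necessarily the original code):

import org.eclipse.emf.common.util.URI;
import org.eclipse.xtext.resource.impl.DefaultResourceServiceProvider;

public class CDOResourceServiceProvider extends DefaultResourceServiceProvider {

 @Override
 public boolean canHandle(URI uri) {
  // Accept CDO URIs of the form cdo://<repo name>/folder/resource,
  // and fall back to the default behaviour for everything else.
  if (uri != null && uri.scheme() != null && uri.scheme().startsWith("cdo")) {
   return true;
  }
  return super.canHandle(uri);
 }
}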
The CDO implementation of the service provider can then be installed in the xxxRuntimeModule Guice class:

public Class<? extends IResourceServiceProvider> bindIResourceServiceProvider() {
 return CDOResourceServiceProvider.class;
}

So this service provider can now handle CDO URIs; next, the actual global scope provider is needed.


IGlobalScopeProvider



In the CDOGlobalScopeProvider we need to return an IScope based on the requested context.
Our implementation uses the EClass of the referenced type in package A or B to load the CDO resource.

When initializing the CDOGlobalScopeProvider we build a map of EClass to CDO resource URI(s), which is queried when an IScope is requested. The target EClass is derived from the EReference which is part of the API call for the IGlobalScopeProvider as seen in this method signature.

IScope getScope(Resource context, EReference reference, Predicate<IEObjectDescription> filter);

When the CDO URI is retrieved from the map, we then invoke (on our custom CDO IResourceDescriptions):

IResourceDescription description = descriptions.getResourceDescription(uri);
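
Putting these pieces together, a minimal sketch of such a global scope provider could look like this. It is not the actual implementation: the map initialization is omitted, the Predicate filter is ignored for brevity, and the injected IResourceDescriptions is assumed to be the CDO aware index described below.

import java.util.List;
import java.util.Map;

import org.eclipse.emf.common.util.URI;
import org.eclipse.emf.ecore.EClass;
import org.eclipse.emf.ecore.EReference;
import org.eclipse.emf.ecore.resource.Resource;
import org.eclipse.xtext.resource.IEObjectDescription;
import org.eclipse.xtext.resource.IResourceDescription;
import org.eclipse.xtext.resource.IResourceDescriptions;
import org.eclipse.xtext.scoping.IGlobalScopeProvider;
import org.eclipse.xtext.scoping.IScope;
import org.eclipse.xtext.scoping.impl.SimpleScope;

import com.google.common.base.Predicate;
import com.google.common.collect.Maps;
import com.google.inject.Inject;

public class CDOGlobalScopeProvider implements IGlobalScopeProvider {

 @Inject
 private IResourceDescriptions descriptions; // our CDO backed index, see below

 // Filled when initializing the provider; tells us in which CDO resource(s)
 // instances of a given EClass live.
 private final Map<EClass, List<URI>> eClassToURIMap = Maps.newHashMap();

 public IScope getScope(Resource context, EReference reference,
   Predicate<IEObjectDescription> filter) {

  final EClass type = reference.getEReferenceType();
  final List<URI> uris = eClassToURIMap.get(type);
  if (uris == null || uris.isEmpty()) {
   return IScope.NULLSCOPE;
  }

  // Chain a scope per CDO resource, exposing the objects of the requested type.
  IScope scope = IScope.NULLSCOPE;
  for (URI uri : uris) {
   final IResourceDescription description = descriptions.getResourceDescription(uri);
   if (description != null) {
    scope = new SimpleScope(scope, description.getExportedObjectsByType(type));
   }
  }
  return scope;
 }
}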

IResourceDescriptions

As we don't want to load the CDO resource each time an IScope is requested, we use an index for the IResourceDescription produced by the IResourceDescription.Manager, in our IResourceDescriptions implementation.

Xtext comes with a SimpleCache implementation which can be used for that. Additionally we want the index to be updated when the CDO models change, so I borrowed some of the Dawn CDO listeners, boiling down to an implementation of the following method:


public void handleViewInvalidationEvent(CDOViewInvalidationEvent event);


From this, I could tell the CDO URI / IResourceDescription index to clean the URI entries for the event's dirty objects, or any other objects invalidated by the event. The next time an IScope is requested for such a URI, the cache simply rebuilds the IResourceDescription.
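
A minimal sketch of that invalidation handling, assuming a simple Map<URI, IResourceDescription> cache named descriptionCache (registering the listener on the CDOView is not shown):

public void handleViewInvalidationEvent(CDOViewInvalidationEvent event) {
 for (CDOObject dirty : event.getDirtyObjects()) {
  final CDOResource resource = dirty.cdoResource();
  if (resource != null) {
   // Drop the cached description; it will be rebuilt lazily on the next IScope request.
   descriptionCache.remove(resource.getURI());
  }
 }
}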

Conclusions so far. 

I am aware this blog doesn't provide the details and actual implementation for people to re-use. The reason is that the code is pretty specific to my CDO models: the CDOGlobalScopeProvider map which tells me in which CDO URI an EClass resides is really built for my CDO model and not re-usable in a generic way.


private Map<EClass, List<URI>> eClassToURIMap;

Additionally, I have some custom code to open a CDO session and view to actually retrieve the CDO resource; this won't be directly re-usable in any other app.

I still wanted to share this experience, and simply state that CDO and Xtext work great together! I am happy to share more details with anyone interested in learning from this experience.

Next steps

It's pretty cool to commit an object in CDO and seconds later see it appear in code completion in an Xtext editor! However:

  1. For larger resources, building the IResourceDescription is time consuming. It becomes important how CDO resources are stuffed with objects of certain types. In my experience, at some point CDO resources have to be chopped up into CDO folders etc. to make them smaller units which can be worked with. 
  2. Related to 1), the index should actually be built in application idle time, ready for use whenever I want to link or get a proposal for a CDO cross-referenced object from Xtext. A solution could be a background job doing this, similar to the Xtext builders which do this in the background. 
  3. I find it's sometimes needed to limit the scope of exposed CDO objects to a certain context. I haven't figured out exactly whether this should be done in the ProposalProvider, by the generated declarative scoping, or in the CDO global scope provider. I would appreciate some advice on this. 
  4. Perhaps with some help from the Xtext and CDO guys, we could turn this into an off-the-shelf Xtext fragment. I am personally a bit disappointed by the lack of focus of Xtext on runtime RCP apps, although the way it's built up with Guice makes it super easy to replace or leave out capabilities.