Welcome back to part 3 of my recap of Magento Imagine 2017! This time I’ll be covering the talks and presentations outside of the general sessions / keynotes.

First off though, if you missed either of the first 2 parts of the recap, you can check them out here:

Given the number of talks that took place at Imagine, for this article I’ll be covering talks from day 1 only.  I’ll do a further post covering days 2 & 3 together, and following that a final post (or 2) on the community engagement events, such as MageTalk live, the diversity talk, the DevExchange and of course the parties, before ending with some final thoughts on the event.



One of the main attractions of going to any Magento conference is going to the talks and presentations provided by a wide range of clever people from the community and in some cases from Magento themselves.

Obviously I wasn’t able to attend all talks, so I’ll only be covering talks I was present at (or got the lowdown on from someone else), and given my background in development / tech, the content covered will naturally be focused on these areas.


Omnichannel Solutions with Magento Commerce Order Management

The first talk I made it to was the Magento Commerce Order Management (MCOM) talk by Mark Hatch, Director of Product Delivery at Magento.

This talk provided a technical overview of, and insight into, Magento’s MCOM product, explaining where it generally fits alongside a Magento store (or stores) and other channels / services, and the features it offers, such as inventory and order management as well as fulfilment, all from a centralised hub.

Key features of MCOM:

  • SaaS based, hosted on AWS
  • Connectors for both Magento 1/2 are provided ‘out-of-the-box’
  • Support for multi-node inventory, or ‘aggregates’
  • Stock safety control to mitigate risk of ‘oversell’
  • Ability to batch process order sourcing, including sourcing from the best locations, and schedule the times of day these rules are applied
  • Multiple fulfilment types covered, including drop shipping, click & collect, ship to store etc.
  • Flexible use cases, such as with external message based systems or integrated within another 3rd party tool
  • Manage cancellations / returns / refunds
  • Connect to external CRM e.g. Oracle Service Cloud
  • Payment gateway integration to allow for further transactions and refunds

Overall this was an interesting overview of the features MCOM provides, and one of the key takeaways was the future plan to increase partner enablement by improving developer and QA training and resources, as well as readying the product for the international market.

For more information see the slides here.


Expert Guidance on Migrating from Magento 1 to Magento 2

Next up I headed over to watch the talk from James Cowie and Gordon Knoppe from Magento’s ECG team regarding managing the migration from Magento 1 to 2.

I consider myself a veteran of these talks now, having seen multiple talks on the topic at events last year, such as Magento Live UK and Mage Titans MCR.


Gordon kicked off with a high-level overview of the areas to consider, especially in regards to planning, such as which types of content to migrate and whether certain customisations and modules from Magento 1 are still relevant / required.  It was also noted that one exception (i.e. something that cannot be migrated) is themes.


James then took over with a lower level technical overview of both the data and code migration tools as well as some useful practical tips and advice.


Data Migration

The types of data that can be migrated with the tool are: store configuration, products, customers, orders and promotions. There are 3 modes the tool can run in: settings (store configuration transfer), data (the main transfer) and delta, which purely looks for changes since the last run.


James then provided a full overview of running the tool, how to map values and the common errors encountered.
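For reference, those three modes map to three CLI commands in the Data Migration Tool. A typical run, sketched below, looks something like this (the config path is a placeholder and depends on your edition and the source/target versions):

```sh
# 1. Migrate system and website configuration first
php bin/magento migrate:settings <path-to-config.xml>

# 2. Then migrate the main data: products, customers, orders, promotions
php bin/magento migrate:data <path-to-config.xml>

# 3. Finally, pick up changes made on the live M1 store since the last run
php bin/magento migrate:delta <path-to-config.xml>
```

The delta mode can be run repeatedly in the run-up to launch, so the final cutover only has to catch the most recent changes.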


Code Migration

This was as much an overview of the new technologies M2 brings over M1, and James made it clear the code migration tool is purely an aid for migration: it won’t port your entire module and business logic automatically.


Finally, there were a few tips on deployment at the end of a migration process, plus one vital tip from James:


For more information see the slides here.


Everything Was UI Components & Nothing Hurt

Next on my agenda was Magento’s new Frontend Architect James Zetlen’s talk about every Magento 2 developer’s favourite new feature… UI Components.

It’s an understatement to say that the reaction to Magento 2’s UI Components has been mixed at best, with even some of the most revered minds in the Magento community left scratching their heads when attempting to understand the how and why.


James was on a mission to dispel that sentiment, providing his take on them along with some really insightful points to back up the reasons UI Components should exist.

Firstly, that UI components are not too complicated, but just enough to be useful.


The value I took from this is that UI Components are an evolution of the block system (they extend from blocks, for one!) and their purpose is to segregate the frontend areas developers have to work with into reusable, modular components. The argument is that the mental overhead currently required for a developer to remember everything on the frontend is simply too great.

This is not dissimilar from some of the forward thinking concepts in the backend community in regards to decoupling modules i.e. reducing their dependencies upon one another.

We were even treated to a demo of a UI Component module James had built, and an explanation that the large amount of code (or rather XML configuration) is somewhat important in revealing the complexity involved.

All the above sounds very positive; however, whilst James gave a very comedic and convincing overview, he did accept UI Components have some way to go before they reach the point they should be at. One of the main issues currently on the list to review is that creating and referencing UI Components within XML is far too verbose (as noted above), and James gave a good example of a potential solution to remedy this and reduce the repetitive nested elements.
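To give a feel for the verbosity in question, here’s a minimal, simplified sketch of the kind of nested `<argument>` / `<item>` configuration a UI Component declaration typically involves (the component and provider names here are hypothetical):

```xml
<listing xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
    <argument name="data" xsi:type="array">
        <item name="js_config" xsi:type="array">
            <item name="provider" xsi:type="string">example_listing.example_listing_data_source</item>
            <item name="deps" xsi:type="string">example_listing.example_listing_data_source</item>
        </item>
        <item name="spinner" xsi:type="string">example_columns</item>
    </argument>
</listing>
```

Note that every single value needs its own typed `<item>` element, which is exactly the repetitive nesting under discussion.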

Finally, we were told that, until further progress is made, UI Components should only be created within the admin.


Overall I found this a really engaging talk, and whilst I’m still on the fence on this topic (mainly because I’ve yet to have the time to invest in this area), I’m certainly more positive about the future of UI Components and will keep a keen eye on where they go from here. However, I think the community at large still needs a lot more convincing! 😉


For more information see the slides here.


Performance Best Practices for Magento Enterprise Cloud Edition

The next talk on my list was a supposed technical deep dive on Enterprise Cloud Edition (ECE), which was hosted by Aaron Koch (Senior Cloud Solutions Architect at Magento) and Doug McIver (Senior Director, Product Management at Magento) along with representatives from ECE technology partners Fastly, New Relic and Blackfire.io.


Given the technology partners’ involvement, the content covered each of the areas they assist with: distributed Varnish from Fastly, application performance monitoring (APM) from New Relic and code performance profiling / testing from Blackfire.

On paper this talk sounded like my kind of topic, but for me the overview was a little too high level, and the Q&A wasn’t overly useful, as the majority of the panel themselves stated they weren’t overly technical.

However there were a few useful points to take away.


For more information see the slides here.


The Future of Active Magento Cyber Security

Ok, so I didn’t attend this talk as it clashed with the above; however, I feel it deserves a mention due to its potential importance, the reaction I saw to it on Twitter at the time and the feedback from those that attended, including our MD Dave.

I won’t go into too much detail on the talk (you can reference the slides for that), however the key takeaway was that John Steel, Head of Information Security at Magento, announced ‘Magento Security Scan’, a tool for monitoring security on Magento stores.


This tool appears to be very similar in nature to Mage Report, but perhaps more in-depth, as it can be set up with SSH access to your Magento site’s server and also offers automated scanning routines and detailed reports.


There was only one query I heard raised that I’m still unsure about: given the scan can access your Magento store’s server(s) via SSH, the Security Scan host will potentially hold SSH access to a large number of Magento stores in one place!

If you want to get involved, Magento is accepting requests for the beta programme – email securityinfo@magento.com to enrol.



The last talk of day 1 was hosted by Magento’s Director Of Community Engineering, Max Yekaterynenko and Lead Product Manager, Piotr Kaminski and provided an overview and updates on Magento’s technical resources for developers, the DevBox Docker environment and the recently formed Community Engineering Team.

First up Max gave an overview of the key resources available for developers that Magento now provide: Magento DevDocs, Magento Community Forums, Magento Tech Resources (user guides and release notes), Magento U and the Magento GitHub account.

Up next, Piotr gave a detailed overview of Magento’s DevBox, which aids in quickly provisioning local development environments via Docker, and announced that soon these environments would also be able to be provisioned directly from Enterprise Cloud Edition instances. Again it was reiterated that DevBox is for use only in development, not production, which has obviously been a discussion point since its original release, given many people’s insistence that all Docker environments should be persistent from development through to production. It was also noted that Linux support and split containers are on the future roadmap.

Max then retook presentation duty and gave an overview of the newly created Community Engineering team at Magento that has been working hard to help increase activity and merging of pull requests on the Magento 2 GitHub repo, with some impressive stats shown.

Further to this, Max outlined the team behind him on this journey, as well as the ‘Community Gatekeepers’ that are now also helping with the process.

Overall I found this a really informative talk; it gave great insight into the improvements Magento have been making in this area and, more importantly, provided some much needed confidence to the community that things are moving in the right direction. One final takeaway was that anyone working with Magento Enterprise who is interested in contributing can request access to the EE GitHub repo.

For more information see the slides here.



Ok, so that covers all the talks I attended on day 1.  Now, having recapped these myself, I think there was a lot of useful content presented across all the talks. Personally, I did initially (and still do to an extent) feel that some of the talks / content were not as technical or in-depth as I’d have liked, but I appreciate that the number of attendees and their range of backgrounds and levels of knowledge / experience means talks need to cater for a wide spectrum. As noted at the beginning, next up will be more of the same, but focusing on the talks presented on days 2 and 3 of Imagine, so check back soon!