In this interview, we had the pleasure of speaking with Dr. Brian Langelier, Facilities Manager at the Canadian Centre for Electron Microscopy (CCEM) at McMaster University. As Canada’s largest and most advanced electron microscopy facility, CCEM generates vast amounts of complex data across a wide range of research projects. Dr. Langelier shared his perspective on the growing importance of effective data management in large research environments, particularly in transmission electron microscopy (TEM), where datasets continue to expand in both size and complexity. As the size and complexity of TEM data streams increase, there is a growing emphasis on using machine-actionability to make data findable, accessible, interoperable, and reusable, a practice known as the FAIR principles. Software solutions, such as AXON Synchronicity, can help streamline the complex workflows needed to capture, manage, and analyze in situ TEM results.
Can you tell us a little about your background and how you got into the field of microscopy?
Sure, my background is actually in mechanical engineering. During my PhD I drifted into materials science, working with light metals and that is when I first got into microscopy. I was doing my PhD at the University of Waterloo, but since McMaster and CCEM had advanced equipment close by, I was able to come down here quite a bit to use the equipment and get some training. That is where I got my first exposure to TEM.
During my PhD, I also had the chance to spend some time at the University of Sydney, working with Simon Ringer’s group on atom probe tomography. At that time, atom probe was not yet established in Canada, so collaborating with an institute that had atom probe expertise was a great opportunity. And then, by coincidence, right when I finished my PhD, CCEM installed Canada’s first atom probe. The scientific director at the time, Gianluigi Botton, knew me from my visits and knew I had experience with atom probe, which was rare in Canada, and he invited me to join as a postdoc. That eventually turned into a staff position running the atom probe and supporting FIB and TEM projects.
During the pandemic, the facility was expanding and there were some management changes, which opened up the facility manager position. I applied and have been in that role since 2020. These days, my responsibilities are mostly around infrastructure planning, making sure the equipment and services are maintained, and looking at what is next in terms of acquisitions. I also lead a team of about 15 technical staff, a really skilled group with various backgrounds, and they are a big part of what allows us to deliver high-quality training and support to our users.
McMaster University’s CCEM has become a hub for advanced electron microscopy projects. From your perspective, what makes CCEM unique in the way it supports such a diverse range of research activities?
CCEM is unique in that it is a national research platform based here at McMaster, with one of the most diverse suites of electron, ion, and x-ray microscopy tools in the country. I would say our strength really comes down to three things: capability, connectivity, and culture.
On the capability side, we have instrumentation that covers everything from imaging millimetre-scale structures all the way down to single atoms. Connectivity comes from our broad user base: over six hundred researchers a year from academia, industry, and government, whom we try to support through structured partnerships and collaborations. And then culture is a big one; we really emphasize training, education, inclusiveness, and knowledge sharing, and that is driven by our incredible technical staff.
At the same time, we are not a fully funded centre, so part of what makes us unique is also how we operate. We rely on a mix of baseline funding from the university and government, but also user contributions and industry partnerships. That means we are constantly working to balance access, manage training, and make sure people have what they need, and it pushes us to improve not just the instruments, but also the data processing, storage, and transfer support that comes with modern microscopy.
And of course, this is a big team effort. We have different people for different roles, including a manager who leads our education and outreach programs, a strong admin team that keeps the financial and operational side running, and our scientific director, Dr. Nabil Bassim, who provides overall leadership. This means we operate in a very collaborative environment, which is what allows CCEM to support such a wide range of research activities.
CCEM works with a wide range of users, from academic researchers to private companies. How do you engage with industry partners and encourage them to collaborate with the Centre?
We often talk about outreach in two ways: one being business development, and the other being education and public engagement. On the business side, our outreach is closely tied to our financial stewardship as a facility. A part of our role is to bring in industry partners who need access to advanced electron microscopy and the expertise that comes with it. Those projects help sustain the centre and, at the same time, create great opportunities for collaboration and help various companies leverage our capabilities for their R&D.
And then, on the broader outreach side, we do quite a bit to promote microscopy and materials science more generally. We run workshops, participate in conferences, and do a lot of community outreach. The latter includes things like public open houses, school visits, and even STEM camps where kids can actually try out microscopy. So overall, our engagement spans from industrial collaboration to public education, and I think that balance is part of what keeps CCEM so dynamic. We are also really excited to be hosting the Atom Probe and Microscopy Conference in 2027, which will be a fantastic opportunity to bring the community together.
With so many different projects under one roof, how do you approach managing the growing volumes of microscopy data generated daily?
This is definitely one of the biggest challenges for a facility like ours. We manage data through a tiered framework; first it is collected on the microscope workstations, then transferred to our internal CCEM Data Server for short- to medium-term storage and processing. For long-term archiving, that responsibility usually shifts to the research groups themselves.
Even with that system, the scale is becoming harder to manage. We are especially seeing real bottlenecks in 4D-STEM, in situ TEM, and X-ray Computed Tomography (CT), where datasets can be enormous and the transfer times alone are a challenge. Building the right balance between accessibility, performance, and cost is something we are still actively working on, and it is an area that will only grow in importance as experiments become more data-intensive.
And when it comes to managing all of that information, who actually owns the data? Is it handled by each project or student individually, or is there a more centralized system in place?
CCEM’s policy is that independent users manage their own datasets, but we provide our own data server as a free and secure space for short-term storage, transfer, and collaboration. For projects run by our technical staff, the data are handled centrally under our Data Management and Cybersecurity Framework, ensuring everything is stored consistently and safely. Even though we don’t own the data, we treat it as strictly private, because users, especially academic researchers, may have agreements or funding requirements that limit access. That approach balances user autonomy, security, and facility-wide integrity, and it works for both academic and industry partners.
This approach gives us the flexibility to support academic users, who often want open collaboration, while also meeting the needs of industry partners, who may have confidentiality or IP concerns. It is a balance between autonomy and consistency, and it is what allows such a diverse user base to work effectively within one facility.
What do you see as the biggest challenges in handling and organizing large-scale TEM datasets across multiple research groups? And how do you typically provide reports or outputs to your users?
There are a few key challenges. Experiments like 4D-STEM, X-ray CT, and in situ TEM can generate enormous datasets very quickly. Next to this, different instruments and proprietary file formats make it tricky to ensure interoperability, long-term preservation, and smooth sharing among collaborators. Lastly, not everyone has a strong computational background, so we often need to provide curated workflows and training to make the data meaningful and usable.
As for reporting, it really depends on the user. Some people want a formal report, sometimes in a PowerPoint deck or similar format. But many of our biggest industry users already have their own microscopes, workflows, and staff to process the data. They come to CCEM because our instruments can do something beyond what they have in-house, but they don’t need us to generate reports or do post-processing. They just want the raw data so they can analyze it themselves. For others, especially academic users or those less familiar with microscopy, we do provide processed data and detailed methods. So there isn’t really a standard; we try to stay flexible and provide what the user actually needs without adding unnecessary costs.
Has AXON helped relieve some of this sharing of data?
AXON fits well with CCEM’s focus on knowledge-centric workflows, especially for datasets that can be difficult to process. We have seen it work particularly well where a user might want to visualize a dataset themselves. Even if they do not do their own processing, having free software means we can share it with collaborators so they can explore and analyze the data on their own.
In another effort to streamline workflows, we have set up virtual workstations for academic users, so they can securely access software and process data from anywhere, just as they would on a local workstation. The idea is that access to post-processing tools becomes part of the user program, alongside access to instruments. Rather than distributing the software itself, we share the entire workstation through a server-hosted virtualized environment with a limited number of licenses. That makes it safe, secure, and also accessible.
In your opinion: are there any important data integrations still missing to make EM data management easier and more streamlined?
There are definitely a few areas where improvements could make a big difference. One is standardization across vendors. Right now, proprietary file formats make interoperability and long-term preservation difficult. Another is better linkage between the Laboratory Information Management System (LIMS), data acquisition, and compute platforms. At CCEM, we manage separate systems: our data server for storage, and the laboratory information system for bookings, billing, and user access. Each system requires its own administration, and it would be really helpful if these could be blended together.
Do you see any trends with funding requirements where a solid data management strategy needs to be in place as part of the proposal?
Data management and cybersecurity policies are definitely part of facility grants here in Canada, and granting agencies are increasingly looking for them. What exactly they require can depend on the type of data. For example, anything tied to individuals, like health data, will have stricter requirements.
More broadly, there is also a growing focus on research security. This goes beyond cybersecurity and touches on who has access to data and infrastructure. In an age where open science can sometimes be exploited, funding agencies want to see that researchers are thinking carefully about how data are shared and who can access it, and that they have a plan to manage it responsibly.
How do you envision the future of data management in TEM evolving, especially as new forms of automated and synchronized data capture become available?
I think the future is really about better integration and automation. Ideally, instruments would natively capture complete metadata streams, and systems like LIMS, compute platforms, and cloud storage would merge into unified platforms. Every TEM is essentially its own laboratory, capable of many different modes and experiments. What I would love to see is data management that reflects that versatility, so that you do not have a completely different workflow for 4D-STEM versus brightfield imaging, EDS, in situ, or STEM experiments. Each has unique requirements, of course, but ideally a system could handle all of that variation seamlessly.
That said, it may be a huge challenge to build a system that works for everything. Even something as simple as having all the microscope computers running on a single monitor is difficult. So while the vision of automation, integration, and security is clear, the path to get there may still be gradual.
What advice would you give to facilities or researchers who are just starting to think more strategically about how they handle and preserve their microscopy data?
Do the boring homework. Facilities are founded by scientists, and we all want to dive straight into the exciting science. But it is really important to get your facility governance, data management, and user access policies in place before they become problems. Planning ahead is much easier than trying to rebuild systems later as you grow.
Think of it like city planning. If you wait until your facility is too big, it is much harder to install the infrastructure you need. But if you plan for growth from the start, everything becomes smoother. This doesn’t just apply to data management, it includes laboratory information management software, access control, and security policies.
Finally, it’s about training, longevity, and balance: empower staff and students with data literacy, consider retention and archiving, and balance openness with security. Having these systems in place also helps when you engage with industry partners or other collaborators. You can show them how you work, rather than improvising on the spot.
Thank you so much Dr. Langelier for sharing your insights and giving us a really interesting perspective on the future!