Research Workflows & Documentation

LIMS vs ELN vs SDMS: What Your Lab Actually Needs

[Image: laboratory sample tracking - a scientist scanning barcoded tubes at a lab workstation]

If you've ever tried to figure out the difference between a LIMS, an ELN, and an SDMS, you've probably noticed something: every vendor explains it differently, and the explanation always ends with their product being the answer. So let's skip that. This is a straightforward breakdown of LIMS vs ELN vs SDMS, written by someone who has actually used all three at the bench and who has seen the benefits (and limits) of each. 

TL;DR: these three systems solve different problems. They overlap (sometimes a lot), and the lines between them have gotten blurry in the last few years. But the core jobs are distinct, and if you pick the wrong one for your lab's actual pain points, you'll spend six months implementing software that nobody wants to use.

What an ELN Actually Does

An electronic lab notebook is, at its core, a replacement for your paper lab notebook. That sounds simple. It isn't. A good ELN handles experiment documentation, lets you attach files and images, supports templates for common assays and procedures, tracks who did what and when, and keeps a version history that satisfies auditors. Some of the newer platforms, like IGOR, also tie your notebook entries directly to your inventory and sample records, which means you can trace from a result all the way back to the specific reagent lot you used.
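That result-to-reagent-lot traceability is, under the hood, just linked records. Here is a minimal sketch of the idea in Python; the record types and field names are invented for illustration and are not IGOR's actual schema:

```python
from dataclasses import dataclass

# Hypothetical record types -- invented for illustration only.

@dataclass
class ReagentLot:
    lot_id: str
    name: str

@dataclass
class Sample:
    sample_id: str
    made_with: list[str]   # lot_ids of the reagent lots used to prepare it

@dataclass
class NotebookEntry:
    entry_id: str
    sample_ids: list[str]  # samples referenced by this experiment
    result: str

def trace_lots(entry: NotebookEntry, samples: dict[str, Sample]) -> set[str]:
    """Walk from a notebook entry back to every reagent lot its samples used."""
    lots: set[str] = set()
    for sid in entry.sample_ids:
        lots.update(samples[sid].made_with)
    return lots
```

When the records are linked like this, "which lot did this result come from?" becomes a quick graph walk instead of an archaeology project through old spreadsheets.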

The people who benefit most from an ELN are the ones at the bench. Grad students. Postdocs. Research scientists. Anyone who runs experiments and currently documents them in a paper notebook, a Word doc, OneNote, or some unholy combination of the three plus sticky notes on the laminar flow hood.

Where ELNs fall short is sample management at scale. If you're processing 500 clinical samples a week and need to track every one through accessioning, testing, storage, and reporting, a standalone ELN won't cut it. That's LIMS territory.

What a LIMS Actually Does

A Laboratory Information Management System (LIMS) is built around samples. It tracks the full lifecycle: when a sample arrived, who logged it, what tests were run on it, where the results went, and where the sample is now. LIMS platforms handle things like chain of custody, batch processing, certificate of analysis generation, and integration with instruments that spit out data automatically.
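Conceptually, that lifecycle is a small state machine with an append-only history. A simplified Python sketch (the states and transitions here are invented for illustration; a real LIMS models far more, configurable per workflow):

```python
# Allowed sample-state transitions -- a deliberately tiny, hypothetical set.
ALLOWED = {
    "received":    {"accessioned"},
    "accessioned": {"in_testing"},
    "in_testing":  {"reported"},
    "reported":    {"in_storage", "disposed"},
    "in_storage":  {"in_testing", "disposed"},
}

def advance(history: list[tuple[str, str]], new_state: str, who: str) -> None:
    """Move a sample to new_state, recording who did it (chain of custody)."""
    current = history[-1][0]
    if new_state not in ALLOWED.get(current, set()):
        raise ValueError(f"illegal transition: {current} -> {new_state}")
    history.append((new_state, who))
```

The append-only history is the chain of custody; refusing illegal transitions is what keeps that history trustworthy enough to show an auditor.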

LIMS got its start in quality control and testing labs, and that origin still shows. If you work in a QC environment, a clinical lab, a contract research organization, or any setting where samples flow through a defined pipeline with regulatory oversight, LIMS is probably the backbone you need.

But this is where the decision gets more nuanced. If your lab is primarily doing discovery research - the kind where you might spend three weeks troubleshooting an assay before you have a single repeatable step worth documenting - a LIMS can start to feel like wearing a suit to go hiking. It's built around structured, repeatable workflows. Early-stage research is often neither of those things.

The issue isn't just philosophical. LIMS platforms are, by design, fairly rigid systems. Adapting one to match an unconventional or evolving workflow typically means either bringing in internal developers or going back to your vendor — and vendor-led customization is rarely cheap. Implementation fees, customization costs, and ongoing support contracts have a way of compounding, and what looked like a straightforward software decision starts carrying a much heavier price tag than anyone budgeted for. For smaller labs or teams operating under tight R&D budgets, that financial reality can be a dealbreaker before the system even goes live.

Then there's what happens after rollout. R&D environments change constantly: research directions shift, new collaborators come on, methods get replaced. Every time that happens, someone has to figure out how to adjust the system to make sure it continues to align with your workflows. When that process is slow or expensive, people find workarounds. They keep a parallel spreadsheet, they skip logging steps, they default back to whatever they were doing before. The system technically exists, but nobody's really using it.

This is why many discovery-focused labs find that an Electronic Lab Notebook fits more naturally into their day-to-day. ELNs are generally designed with flexibility in mind. You can capture experiments as they actually unfold, search back through months of notes without needing a formal data structure, and onboard new team members without a training program. If what your lab actually needs right now is cleaner documentation and more accessible records, that's where an ELN earns its keep.


What an SDMS Actually Does

Scientific Data Management Systems are the least understood of the three, partly because they've been absorbed into other products and partly because the name is vague. An SDMS is fundamentally about storing, organizing, and retrieving raw instrument data. Think HPLC chromatograms, mass spec files, plate reader outputs, NMR spectra. The kind of data that sits in proprietary file formats on the C: drive of the instrument computer until someone accidentally deletes it or the hard drive dies.

A good SDMS pulls that data into a central, searchable repository, preserves the original files, and lets you associate them with projects, experiments, or samples. If your lab generates a lot of instrument data and you're currently managing it via a shared network drive, that's the pain an SDMS solves.
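At its core, that job is "crawl, fingerprint, index." A bare-bones sketch of the idea in Python (real SDMS products also preserve the original files, parse vendor formats, and link files to samples; this only shows the indexing step):

```python
import hashlib
from pathlib import Path

def index_instrument_files(root: str) -> dict[str, dict]:
    """Build a searchable index of raw instrument files, keyed by content hash.

    The hash doubles as an integrity check: if a file is silently altered,
    its fingerprint no longer matches the index entry.
    """
    index: dict[str, dict] = {}
    for path in Path(root).rglob("*"):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            index[digest] = {"name": path.name, "bytes": path.stat().st_size}
    return index
```

Even this toy version beats a shared network drive in one respect: you can answer "do we still have the original run file, unmodified?" without opening anything.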

Most labs don't buy a standalone SDMS anymore. The functionality has been folded into modern ELN and LIMS platforms. But it's worth knowing the concept exists, because if instrument data management is your primary headache, you need to ask specifically about it during vendor evaluations, not just assume it's included.

So Where Do They Overlap?

A lot. And more every year.

Modern ELNs increasingly include inventory tracking and basic sample management. IGOR is a good example - it started as an electronic lab notebook and has since added a full inventory module, detailed sample history tracking with audit trails, and the ability to map relationships between samples and across experiments. For small-to-mid-size labs, that combination typically covers their needs completely, without the overhead of a dedicated LIMS. Other platforms like Benchling and Labguru have gone a similar route.

The benefit of such integrated systems is obvious: connecting your samples and reagents directly to your lab notebook entries and associated files means you get the full context of your research in one place - from the materials used, to the procedures followed, to the results generated.

That’s why many enterprise LIMS platforms like LabWare and STARLIMS have been adding ELN modules, trying to cover everything under one roof. 

This convergence is real, but it creates a trap. Just because a vendor says their ELN includes LIMS features doesn't mean those features are as good as a purpose-built LIMS. And an enterprise LIMS with an ELN bolted on often has a notebook that's clunky and frustrating for bench scientists. The question isn't whether the features exist on a spec sheet. The question is whether your team will actually use them.

Honestly, this is part of why IGOR exists. I designed the first version out of frustration with one of the market-leading LIMS systems: a tool so poorly designed that scientists in my own lab reverted to paper notebooks for several months, because the system was creating so many bottlenecks we were genuinely at risk of missing project deadlines. The features were there on paper. The usability wasn't. And when a lab full of scientists decides that pen and paper beats the software you're paying six figures a year for, you have not chosen the right tool.

That gap between what a system claims to do and what people will actually adopt is something you see over and over in lab software, largely because a lot of these platforms were designed and built by people who have never set foot in a lab. And it shows. 

The Decision Framework That Actually Helps

Forget about product categories for a second. Start with your lab's top three pain points and work backward.

If your main problem is experiment documentation - people can't find old data, protocols aren't standardized, onboarding a new hire takes forever because nothing is written down properly - you need an ELN first. A platform like IGOR that also handles inventory and sample tracking gives you room to grow without needing a separate LIMS right away.

If your main problem is sample throughput - you're losing track of samples, QC turnaround is too slow, auditors keep asking about chain of custody and you're scrambling to reconstruct it from paper logs - you need a LIMS. Look at STARLIMS, LabVantage, or newer options like QBench if you want something more modern.

If your main problem is instrument data - raw files scattered across instrument PCs, no centralized search, analysts spending 30 minutes finding the right chromatogram - you need SDMS capabilities. Check whether your existing software has this built in before buying something new. Many ELN and LIMS platforms now include file storage and instrument integration that handles this.

If everything is a mess (which is honestly the most common scenario), start with the problem that wastes the most person-hours per week. For most research labs, that's experiment documentation and finding old data. For most production and QC labs, it's sample tracking and compliance.

Size Matters More Than You Think

A 10-person academic lab and a 200-person pharma R&D group can both say they need better lab data management. But the right answer is very different.

Small labs and startups should start with an ELN that has built-in inventory and basic sample tracking. It's faster to implement, cheaper per user, and you'll actually get people to adopt it. Trying to deploy an enterprise LIMS in a 15-person lab is like buying a semi truck to move apartments. It'll technically work, but you'll hate the process and you'll have spent way too much money.

Mid-size labs - maybe 20 to 80 scientists - are where the decision gets harder. You need good experiment documentation AND solid sample management. This is where the unified platforms make the most sense, or where you pair a strong ELN with a LIMS that integrates well with it.

Enterprise labs usually end up with both a LIMS and an ELN, sometimes from the same vendor, sometimes not. At that scale you often also need dedicated SDMS for high-volume instrument data, plus a data warehouse that sits above everything. The implementation takes a year or more and costs a small fortune. That's the reality.

The Questions Nobody Asks (But Should)

When you're sitting in a vendor demo, everyone asks about features. Here are the questions that matter more.

Can your team configure it without the vendor's help? Every platform says it's configurable. In practice, many require vendor professional services for anything beyond basic setup. That's expensive and slow. Look for tools that let your lab manager add a new sample type or modify a template without calling anyone.

What's the demo experience like? There's a big difference between seeing a platform in action and being walked through a curated slide deck with a few staged screenshots dropped in. Sales demos are designed to impress, not to inform. And a polished presentation that never shows you the real interface up close should tell you something. If a vendor can't or won't open the application and just use it in front of you, that's worth paying attention to. The goal of the demo (from your perspective) is to give you a realistic feel for what it would be like for your team to use the software every single day. If you leave the call without that, you haven't really seen the product yet.

How well does it handle the basics? And I mean the real basics: logging out samples, sure, but also undoing that when you realize you grabbed the wrong one. It sounds like a trivial thing to ask, but you'd be surprised how many platforms fall short on these simple tasks. Before you commit to anything, ask for a free trial and run your own workflows through it. Or, at minimum, get them on a call and ask them to click through your actual day-to-day processes live - not a polished demo script, your workflows. That's where you find out what you're actually buying.

How easy is it to get data out? If you ever need to switch platforms, can you export your data in standard formats, or are you locked in? This is the question that makes vendors uncomfortable, and the answer tells you a lot.

What does the onboarding process actually look like? Not the brochure version - the real timeline, who does the work, and how much it costs. Some platforms come with a dedicated implementation team and a structured rollout. Others hand you a login, point you to a help center, and wish you luck - or charge thousands to train your team on the software. Ask specifically how long it typically takes for a lab your size to be fully up and running, and whether that timeline includes training for bench scientists or just system administrators. If the answer is vague, that's usually because the process is painful or expensive and they'd rather you find out after you've signed.

Where This Is All Heading

The trend is clearly toward convergence. Five years from now, the distinction between ELN and LIMS will be even blurrier than it is today. AI is accelerating this because it works best when your data is connected, and keeping experiment records in one silo and sample data in another makes AI much less useful. The platforms that structure data well and keep everything linked are going to have a significant advantage.

But the convergence isn't complete yet, and it matters which direction you approach from. An ELN that grows into LIMS territory tends to keep the scientist-friendly interface and flexibility that research labs need. A LIMS that adds an ELN module tends to keep the structured, compliance-heavy architecture that QC and production labs need. Starting from the wrong direction means fighting against the tool's design philosophy.

Pick the system that solves your most urgent problem really well, and make sure it has a credible path to grow with you. That's more useful advice than any feature comparison matrix.

Anika Weber, DPhil; CEO and Founder of IGOR