Labs the world over are jumping onto the gene editing bandwagon (and into the inevitable patent arguments). And it’s hard to blame them. As these technologies have evolved over the last two decades, from zinc finger nucleases (ZFNs) to transcription activator-like effector nucleases (TALENs) to CRISPR, they’ve become ever more powerful and easier to use.
But one question keeps coming up: How precise are these systems? After all, a method that selectively mutates, deletes or swaps specific gene sequences (and now can even turn genes on) is only as good as its accuracy.
Algorithms can predict the likely “off-target” edits based on the target’s DNA sequence, but they’re based on limited data. “The algorithms are getting better,” says Richard Frock, PhD, a fellow in the laboratory of Frederick Alt, PhD, at Boston Children’s Hospital. “But you still worry about the one rare off-target effect that’s not predicted but falls in a coding region and totally debilitates a gene.”
Frock, Alt (who leads Boston Children’s Program in Cellular and Molecular Medicine, or PCMM), fellow Jiazhi Hu, PhD, and their collaborators recently turned a method first developed in Alt’s lab for studying broken chromosomes into a quality assurance tool for genome editing. As a bonus, the method—called high-throughput genome translocation sequencing (HTGTS)—also reveals the “collateral damage” gene editing methods might create in a cell’s genome, information that could help researchers make better choices when designing gene editing experiments.
The basics of breaking chromosomes
To break a chromosome, both of its DNA strands have to snap or be cut, creating a double-strand break. The resulting broken pieces can either re-attach where they came from or form a translocation by attaching to other broken pieces from their own or other chromosomes.
In cancer cells, chromosomes break and translocate as the cells’ genomes become structurally unstable. And B cells use double-strand breaks to shuffle their genes and generate our diverse antibody repertoire.
“[High-throughput genome translocation sequencing] will help researchers find which enzymes work best and, if needed, modify them to work better. It will help the field do what it wants to do.”
With ZFNs, TALENs and CRISPR, researchers essentially use enzymes to break chromosomes on purpose. But what worries Alt, Frock and Hu is the potential for collateral damage within the cell.
Even when the enzymes make “on-target” cuts on both copies of a chromosome, the chromosome pieces can rejoin in six different ways, some of which are unstable and could fuel further breaks, says Alt, who was recently honored with Brandeis University’s Rosenstiel Award for Distinguished Work in Biomedical Science. That’s just the baseline: as the number of off-target cuts goes up, the number of possible translocations increases rapidly.
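The arithmetic behind that “six different ways” is simple pairwise combinatorics: each double-strand break frees two chromosome ends, and in principle any two ends can join. A minimal sketch (my illustration, not an analysis from the paper) shows how quickly the space of possible junctions grows as off-target cuts accumulate:

```python
from math import comb

def possible_junctions(cuts: int) -> int:
    """Each double-strand break frees two chromosome ends; any two
    ends can in principle rejoin, so the number of distinct pairwise
    junctions is C(2 * cuts, 2), which grows quadratically."""
    ends = 2 * cuts
    return comb(ends, 2)

# Two on-target cuts (one per chromosome copy) give 4 free ends,
# which can pair up in 6 distinct ways -- the baseline Alt describes.
print(possible_junctions(2))  # 6

# Each additional off-target cut expands the junction space rapidly.
for cuts in (3, 5, 10):
    print(cuts, possible_junctions(cuts))
```

With ten total cuts there are already 190 possible junctions, which is why even modest off-target activity multiplies the opportunities for translocation.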
It comes down to a matter of frequency. How often do a particular enzyme and its target create off-target cuts, and how often do those cuts lead to potentially problematic translocations?
“What you’d like to have is a really sensitive and easy assay that lets you sort through a number of target/enzyme combinations and look genome-wide to see the frequency of off-target cuts and patterns of translocation, and understand their possible effects on the cell,” Alt says.
Not bait and switch, but bait and prey
This is where HTGTS comes in. Originally developed in Alt’s lab to study cancers and antibody development, it gives a whole-genome view of chromosome breaks and resulting translocations. Adapted as a gene-editing assay, HTGTS cheaply and rapidly exposes potential off-target problems in an enzyme- and target-agnostic way.
It works like this: researchers run their gene-editing experiment using their enzyme and target DNA sequence of choice, letting chromosomes break and translocations form. Using their target sequence as ‘bait,’ they then sequence the reshuffled chromosomes, hunting for ‘prey’: sequences that have attached themselves to the target as a result of the chromosome ends rejoining. This lets the scientists look for translocations that could cause problems, such as ones that inadvertently activate oncogenes or deactivate tumor suppressors.
To test the assay, Alt’s team looked at several enzyme/target combinations for editing RAG1, a gene implicated in some immunodeficiencies. As they report in Nature Biotechnology, they found and validated dozens of cuts and translocations, at varying frequencies, beyond those predicted by existing algorithms.
Arming researchers with information
Alt doesn’t see the data as cautionary or limiting on gene editing as a field. Rather, he thinks that such information will help expand the use of gene editing enzymes.
“The assay gives researchers as much information as possible to run their experiments for a given purpose,” Alt says. “It shows how by using the right levels of enzyme you can balance on- and off-target breaks, and allows researchers to find which enzymes work best and, if needed, to modify them to work better. It will help the field do what it wants to do.”
“Gene editing technologies are moving toward clinical development, based on their promise for treating genetic diseases right at the level of the gene,” says PCMM’s Derrick Rossi, PhD, who recently looked at CRISPR’s accuracy when used to cut genes out of human hematopoietic stem cells and who is collaborating with Alt to refine HTGTS for assaying clinically relevant cells. “But it’s neither cost effective nor technically possible to sequence the entire human genome at the depth required to detect rare off-target effects in an unbiased manner. Fred’s technology solves that problem, and you don’t need any algorithm or prediction at all.
“I think Fred’s technology will become integral to the entire field,” he adds. “As companies move forward with bringing gene editing-based therapeutics to people, this will be part of the quality control.”
Read Alt, Frock and Hu’s paper in Nature Biotechnology.