
Quantinuum Enhances The World’s First Quantum Natural Language Processing Toolkit Making It Even More Powerful

Quantinuum is an integrated quantum hardware and software company that uses trapped ions for its computing technology. It recently released a significant update to Lambeq, its open-source Python library and toolkit named after the mathematician Joachim Lambek (and spelled with a "q" for quantum). Lambeq is the first and only toolkit that converts sentences into quantum circuits, using sentence meaning and structure to define the pattern of quantum entanglement.

Cambridge Quantum initially developed the toolkit before merging with Honeywell Quantum Solutions to form a new company called Quantinuum. Within the combined company, Cambridge Quantum acts as the quantum software arm.

According to Ilyas Khan, CEO of Quantinuum, Cambridge Quantum continues to operate under its own brand because of its large customer base and significant commercial and technical relationships across the industry.

Why is quantum natural language processing important?

Natural Language Processing (NLP) is a form of artificial intelligence that allows computers to understand words and sentences. Nearly every sector of industry makes heavy use of NLP, and usage is expected to grow by more than 27% annually over the next five years.

While NLP is powerful, quantum natural language processing (QNLP) promises to be more powerful still by turning language into circuits that can run on quantum computers.

Applying QNLP to AI can greatly improve it. A large amount of data is required to train AI models, and quantum computing will greatly speed up the training process, possibly reducing months of training to mere hours or minutes.

Existing NLP models built with transformers and deep neural networks consume a great deal of energy, raising environmental concerns. Within this decade, quantum computers are expected to grow from hundreds to millions of qubits, enabling scaled-up versions of QNLP that are faster and more efficient, can process massive data sets, and require less power, with a smaller environmental impact.

Lambeq improvements

The new Quantinuum Lambeq QNLP version contains several major improvements:

  • Lambeq’s training package supports popular supervised-learning libraries such as PyTorch, helping users efficiently train NLP tasks with the constructed quantum circuits and tensor networks.
  • The toolkit can now create a wider variety of quantum circuits.
  • Quantum circuits are now easier to define from context-free grammar (syntax) diagrams.
  • Visualization of the toolkit’s outputs has been improved.
  • The extended documentation contains many examples that ordinary users can follow.
  • A new command-line interface makes most of the toolkit’s functionality available to users with little or no programming knowledge.
  • A new training module simplifies the training of parameterized quantum circuits and tensor networks for machine learning.

In addition to the above improvements, Lambeq has a new neural parser called Bobcat. Parsers determine the meaning of a sentence by breaking it into its grammatical parts. Bobcat was trained on datasets of sentences annotated by humans rather than by computers. As a community feature, Bobcat will also be released as a separate standalone open-source tool at some point in the future.

Since its introduction, researchers have used the toolkit to develop hands-on, real-world QNLP applications such as automated dialogue, text extraction, language translation, text-to-speech, language generation, and bioinformatics.

How does quantum natural language processing work?

Converting a sentence into quantum circuits is complicated. However, QNLP has the advantage of being “quantum native,” meaning that language has a compositional structure that is mathematically similar to the structure found in quantum systems. Also, Lambeq has a modular design that lets users swap components in and out of the model, providing flexibility in architecture design. Here is a simplified version of how the QNLP process works:

  • QNLP converts the sentence into a logical format (a syntax tree) that the computer can work with.
  • The program organizes the syntax tree into parts of speech, using mathematical linguistics to distinguish between verbs, nouns, prepositions, and adjectives.
  • The parts of the sentence are then categorized according to the relationships between the words.
  • The sentence is converted into a string diagram, much like the sentence diagrams taught in elementary school.
  • Once encoded, the tensor networks or quantum circuits, implemented using TKET, are ready to be optimized for machine-learning tasks such as text classification.
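The grammatical bookkeeping behind the steps above can be illustrated with a toy pregroup-grammar reduction, the type system underlying these string diagrams. The sketch below is not Lambeq code; it is a minimal pure-Python illustration (all names are hypothetical) of how word types cancel out: nouns carry type n, a transitive verb carries type n.r·s·n.l, and a string of words is grammatical when the types reduce to the single sentence type s.

```python
# Toy pregroup-grammar reduction (illustrative only; real toolkits such as
# Lambeq automate this with a trained parser).

def reduce_types(types):
    """Repeatedly cancel adjacent pairs (x, x.r) and (x.l, x)."""
    changed = True
    while changed:
        changed = False
        for i in range(len(types) - 1):
            a, b = types[i], types[i + 1]
            if b == a + ".r" or a == b + ".l":
                types = types[:i] + types[i + 2:]  # delete the cancelling pair
                changed = True
                break
    return types

# Word-type assignments: nouns are "n"; a transitive verb is n.r s n.l,
# i.e., it expects a noun on its left and a noun on its right.
sentence = ["n"] + ["n.r", "s", "n.l"] + ["n"]   # "Alice loves code"
print(reduce_types(sentence))  # -> ['s']: the string is a grammatical sentence
```

The reduction order mirrors the wires of the string diagram: each cancellation is a "cup" connecting a word's expectation to its neighbor.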


It is important to note that two of the three true pioneers of QNLP are now working together as senior researchers at Quantinuum. Professor Stephen Clark is Quantinuum’s Head of Artificial Intelligence, and Professor Bob Coecke is its Chief Scientist.

The basis of QNLP is the DisCoCat (Categorical Compositional Distributional) framework developed by Bob Coecke, Stephen Clark, and Mehrnoosh Sadrzadeh in 2010. Once published, DisCoCat quickly became the gold standard because, unlike all other methods, it combined meaning and grammar in a single language model.
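DisCoCat’s core idea, composing word meanings according to grammar, can be sketched classically: nouns are vectors, a transitive verb is an order-3 tensor, and the sentence meaning is the contraction of the verb tensor with its subject and object vectors. The snippet below is an illustrative pure-Python sketch with made-up two-dimensional meanings, not the actual DisCoCat or Lambeq implementation.

```python
# DisCoCat-style composition sketch: contract a transitive verb (order-3
# tensor) with subject and object noun vectors to get a sentence vector.
# All vectors and tensors here are tiny made-up examples.

def compose(subject, verb, obj):
    """sentence[s] = sum over i, j of subject[i] * verb[i][s][j] * obj[j]"""
    dim_s = len(verb[0])
    sentence = [0.0] * dim_s
    for i, sub_i in enumerate(subject):
        for s in range(dim_s):
            for j, obj_j in enumerate(obj):
                sentence[s] += sub_i * verb[i][s][j] * obj_j
    return sentence

alice = [1.0, 0.0]            # noun vector
code  = [0.0, 1.0]            # noun vector
# verb[i][s][j]: how subject-dim i and object-dim j combine into sentence-dim s
loves = [[[0.9, 0.1], [0.2, 0.8]],
         [[0.3, 0.7], [0.5, 0.5]]]

print(compose(alice, loves, code))  # -> [0.1, 0.8]
```

In QNLP the same contraction pattern is realized by entangling qubits in a quantum circuit rather than by multiplying classical tensors, which is why the framework is called quantum native.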

Patrick Moorhead, Founder, CEO and Chief Analyst at Moor Insights & Strategy, recently had an interesting two-part discussion on QNLP, quantum and artificial intelligence with Bob Coecke, Chief Scientist at Quantinuum. The videos can be accessed on the Moor Insights and Strategy YouTube channel. The first part can be seen here and the second part can be viewed here.

Many important NLP applications are beyond the capacity of classical computers. As QNLP and quantum computers continue to improve and scale, many practical commercial quantum applications will emerge along the way. Given the expertise and experience of Professor Clark and Professor Coecke, as well as a strong body of QNLP researchers, Quantinuum has a clear strategic advantage in current and future QNLP applications.

Analyst notes:

  • In last year’s announcement, the toolkit was named “lambeq,” and research papers refer to it the same way. However, Quantinuum’s current announcement refers to the toolkit as λambeq. For simplicity, I refer to it as “Lambeq” in this article.
  • QNLP research is still in the experimental stage. It will be several years before it is mature enough to deploy in large production environments. QNLP is largely unexplored, and Lambeq has opened the door for researchers to use and develop the technology for broader, larger, and more unique applications.
  • QNLP’s current limitations stem from the scaling limitations of existing NISQ machines. Despite these limitations, early research into QNLP is essential to help us understand and advance the science of both QNLP and quantum computing. Ultimately, there will be important real-world uses for QNLP.
  • While it is true that Lambeq is the first toolkit for QNLP, there are other toolkits created for general QML development.
  • Lambeq is available as a traditional Python repository on GitHub here.
  • More details about the new version can be found here.
  • Documentation and tutorials can be found here.

Follow Paul Smith-Goodson on Twitter for current information about quantum computing and artificial intelligence.

Note: Moor Insights & Strategy writers and editors may have contributed to this article.

Moor Insights & Strategy, like all research and technology industry analyst firms, provides or has provided paid services to technology companies. These services include research, analysis, advising, counseling, benchmarking, acquisition matchmaking, or speaking sponsorship. The company owns or has paid business relationships with 8×8, A10 Networks, Advanced Micro Devices, Amazon, Ambient Scientific, Anuta Networks, Applied Micro, Apstra, Arm, Aruba Networks (now HPE), AT&T, AWS, A-10 Strategies, Bitfusion, Blaize, Box, Broadcom, Calix, Cisco Systems, Clear Software, Cloudera, Clumio, Cognitive Systems, CompuCom, CyberArk, Dell, Dell EMC, Dell Technologies, Diablo Technologies, Dialogue Group, Digital Optics, Dreamium Labs, Echelon, Ericsson, Extreme Networks, Flex, Foxconn, Frame (now VMware), Fujitsu, Gen Z Consortium, Glue Networks, GlobalFoundries, Revolve (now Google), Google Cloud, Graphcore, Groq, Hiregenics, HP Inc., Hewlett Packard Enterprise, Honeywell, Huawei Technologies, IBM, Ionever, Inseego, Infosys, Inviot, Intel, Interdigital, Jabil Circuit, Konica Minolta, Lattice Semiconductor, Lenovo, Linux Foundation, Luminar, MapBox, Marvell Technology, Mavenir, Marseille Inc, Mayfair Equity, Meraki (Cisco), Mesosphere, Microsoft, Mojo Networks, National Instruments, NetApp, Nightwatch, NOKIA (Alcatel-Lucent), Nortek, Novumind, NVIDIA, Nutanix, Nuvia (now Qualcomm), ON Semiconductor, ONUG, OpenStack Foundation, Oracle, Panasas, Peraso, Pexip, Pixelworks, Plume Design, Poly (formerly Plantronics), Portworx, Pure Storage, Qualcomm, Rackspace, Rambus, Rayvolt E-Bikes, Red Hat, Resideo, Samsung Electronics, SAP, SAS, Scale Computing, Schneider Electric, Silver Peak (now Aruba-HPE), SONY Optical Storage, Springpath (now Cisco), Spirent, Splunk, Sprint (now T-Mobile), Stratus Technologies, Symantec, Synaptics, Syniverse, Synopsys, Tanium, TE Connectivity, Tenstorrent, Tobii Technology, T-Mobile, Twitter, Unity Technologies, UiPath, Verizon Communications, Vidyo, VMware, Wave Computing, Wellsmith, Xilinx, Zayo, Zebra, Zededa, Zoho, Zscaler. Moor Insights & Strategy founder, CEO, and Chief Analyst Patrick Moorhead is a personal investor in the technology companies dMY Technology Group Inc. VI and Dreamium Labs.
