AI POWERHOUSE ON A LAPTOP: SCIENTISTS HARNESS LOCAL LANGUAGE MODELS FOR ADVANCED RESEARCH AND PRIVACY!

The growing influence of artificial intelligence (AI) in shaping our future continues to take intriguing turns, and the advent of local, compact versions of large language models (LLMs) is a significant leap forward. Applied effectively, these models are transforming the way we approach problems in health research, e-commerce and privacy preservation.

Chris Thorpe, a bioinformatician, runs histo.fyi, a database dedicated to immune-system proteins known as MHC molecules. This complex endeavour is fuelled by Thorpe's extensive use of LLMs, the same technology that powers the chatbots known for charming users with poetry and engaging conversation, and he runs the models on his own computer rather than in the cloud.

Traditionally, LLMs have been associated with vast computing infrastructure because of their computational demands, a bottleneck to widespread adoption. The tide is turning, however: organizations have been releasing 'open-weight' models, versions whose trained parameters can be freely downloaded, many of them compact enough to run locally on an individual's computer. This reduction in size and computational requirements, together with other recent advances, puts the technology within the grasp of many more users.

Microsoft is at the forefront of this shift, having released several versions of its Phi model, ranging from 3.8 billion to 14 billion parameters. Despite their compact size, these models deliver performance that competes with that of much larger systems.
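As a concrete illustration, here is a minimal sketch of how a compact open-weight model might be run entirely on a local machine using the Hugging Face transformers library. The model identifier, prompt and generation settings are assumptions chosen for illustration rather than a prescribed setup; the model card for any given release will state its exact requirements.

```python
# Minimal sketch: running a compact open-weight model locally with the
# Hugging Face transformers library. Model ID and settings are illustrative
# assumptions; check the model card for exact requirements.
from transformers import pipeline

# The weights are downloaded once and cached; afterwards everything runs on
# the local machine, with no prompt data sent to a remote service.
generator = pipeline(
    "text-generation",
    model="microsoft/Phi-3-mini-4k-instruct",  # assumed ID for the 3.8-billion-parameter Phi-3 mini
    device_map="auto",                         # uses a GPU if available (requires the accelerate package)
    trust_remote_code=True,                    # some releases of the model ship custom code
)

prompt = "In one sentence, what do MHC molecules do?"
result = generator(prompt, max_new_tokens=60, do_sample=False)
print(result[0]["generated_text"])
```

The first run downloads a few gigabytes of weights; after that, the same script works without a network connection.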

There are undeniable advantages to these local models. First, they protect privacy by keeping data on the user's own system rather than transmitting it to commercial servers, a genuine draw in a world increasingly concerned with data leaks and the protection of personal information. Second, they are cost-effective and do not require a network connection, making them accessible to researchers in remote areas. Finally, they help ensure reproducibility of findings, because a locally stored model does not change unexpectedly the way a cloud-hosted service can.

Chinese e-commerce giant Alibaba has also embraced open-weight models, developing its own Qwen family, with parameter counts ranging from 500 million to 72 billion. These models underpin derivatives such as Turbcat-72b, which is aimed at aiding researchers.

In healthcare, researchers are pioneering the use of open-weight models with a keen focus on patient privacy. Work in this sector is still experimental, as scientists explore which of the many available models best fits their exact requirements.

For coding, cloud-based tools such as GitHub Copilot are still generally preferred over local AI tools, mainly because of their wider collaborative possibilities and ease of updates. Nevertheless, recently released software such as Ollama lets users download open models and run them locally on their own devices, further expanding what is possible.
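To give a sense of what that looks like in practice, the sketch below queries a model served locally by Ollama from Python using the ollama package. It assumes Ollama is already installed and a model has been pulled beforehand (for example with `ollama pull llama3` on the command line); the model name and prompt are illustrative assumptions.

```python
# Minimal sketch: querying a locally served Ollama model from Python.
# Assumes Ollama is installed and a model has already been pulled,
# e.g. `ollama pull llama3`; the model name here is illustrative.
import ollama

response = ollama.chat(
    model="llama3",  # any open-weight model pulled to the local machine
    messages=[
        {
            "role": "user",
            "content": "Explain in two sentences why open-weight models matter for research.",
        }
    ],
)

# The prompt and the reply never leave the local machine.
print(response["message"]["content"])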

Judging by the past year's rapid, consequential advances, local LLM technology is on the threshold of becoming suitable for most applications. The trend reflects not only progress in AI but also our evolving relationship with it, as we seek a balance between technological capability and the preservation of privacy. As LLMs become more accessible and more powerful, they have the potential to shape our future profoundly.