UMBC has a long history of faculty in Computer Science and Information Systems working on research classified as artificial intelligence (AI), dating back to the mid-1980s. The UMBC Center for AI has more than 60 faculty members with research interests in AI and related areas, including robotics, machine learning, natural language understanding, data science, image processing, multi-agent systems, large language models, knowledge representation and reasoning, planning, knowledge graphs, and neural networks. These faculty members work in more than 30 laboratories and research centers and teach many AI-related courses across departments and disciplines.
With the recent introduction of generative AI, we now see opportunities for AI, specifically large language models (LLMs), to be used in research in many different ways: to improve research productivity by organizing documents, to summarize qualitative data, or to support the development of new research. This work often draws on a variety of resources; some of the resources available for AI research are listed below:
chip-gpu (formerly the ada) cluster
AI computation often relies on highly specialized graphics processing units (GPUs).
The chip-gpu cluster consists of more than 20 distinct server computers, or “nodes.” Each is equipped with NVIDIA GPU cards from a variety of architectures (RTX 2080 Ti, RTX 6000, RTX 8000, L40S, H100). The SIG-GPU subcommittee is a faculty governance group that determines the chip-gpu usage policies and advises DoIT on evolving research needs. For more information on the advanced compute resources made available via the UMBC High Performance Computing Facility, see the hpcf.umbc.edu webpage.
| Year Purchased | CPU Cores | CPU Mem | GPU Cards | GPU Mem (per card) | Number of Nodes | CPU Arch |
|---|---|---|---|---|---|---|
| 2020 | 48 | 384 GB | 8 (RTX 2080 Ti) | 11 GB | 4 | Intel Cascade Lake |
| 2020 | 48 | 384 GB | 8 (RTX 6000) | 24 GB | 7 | Intel Cascade Lake |
| 2020 | 48 | 768 GB | 8 (RTX 8000) | 48 GB | 2 | Intel Cascade Lake |
| 2024 | 32 | 256 GB | 2 (H100) | 100 GB | 2 | Intel Emerald Rapids |
| 2024 | 32 | 256 GB | 4 (L40S) | 48 GB | 8 | Intel Emerald Rapids |
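Work on GPU nodes like these is typically submitted through a batch scheduler such as SLURM. The sketch below shows what a GPU job request might look like; the partition, module, and script names are placeholders rather than the actual chip-gpu settings, so consult hpcf.umbc.edu for the cluster's real submission instructions.

```shell
#!/bin/bash
# Sketch of a SLURM batch script requesting one GPU.
# Partition, account, and module names are placeholders; check the
# HPCF documentation for the values used on the actual cluster.
#SBATCH --job-name=llm-experiment
#SBATCH --partition=gpu          # placeholder partition name
#SBATCH --gres=gpu:1             # request one GPU card
#SBATCH --cpus-per-task=8
#SBATCH --mem=64G
#SBATCH --time=04:00:00

module load cuda                 # module name may differ per cluster
python train.py                  # placeholder for your own workload
```

Submitting the script with `sbatch` queues the job until a node with a free GPU becomes available.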
In addition to the chip-gpu cluster, DoIT has purchased a machine with two NVIDIA H100 GPUs that is being configured to run Meta’s open-source Llama LLM. This gives faculty the opportunity to conduct LLM-based research without incurring the per-use costs of commercial services when LLMs must be run programmatically.
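Locally hosted Llama models are commonly served through an OpenAI-compatible HTTP API (for example, by serving frameworks such as vLLM or Ollama). The standard-library sketch below shows how such a server might be queried programmatically; the URL and model name are placeholder assumptions, not the actual configuration of the DoIT machine.

```python
import json
import urllib.request

# Placeholder endpoint: many local LLM servers expose an OpenAI-compatible
# chat completions API at a URL like this. Adjust host/port to your setup.
LLAMA_URL = "http://localhost:8000/v1/chat/completions"

def build_chat_request(prompt, model="llama3", max_tokens=256):
    """Build an OpenAI-style chat completion payload for a local Llama server."""
    return {
        "model": model,  # placeholder model name
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

def ask_llama(prompt):
    """POST the prompt to the local server and return the first reply text."""
    payload = json.dumps(build_chat_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        LLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Because the model runs on local hardware, scripts like this can issue thousands of such calls in a batch job without accruing per-token charges.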
Cloud Computing AI
UMBC has cloud computing contracts for Microsoft Azure (OpenAI), Google Cloud (Gemini AI), and Amazon AWS Bedrock (Anthropic and Meta). These vendor cloud platforms provide a wide range of commercial options that faculty can use in their research. There is a cost to using cloud computing; DoIT has contracts in place, and if your grant allows the purchase of cloud computing resources, we can set up accounts specific to your grant for chargebacks. DoIT has worked with several cloud vendors and has found that cloud computing costs for AI are quite reasonable if architected correctly. Please submit a ticket and let DoIT know how we can help you.
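When budgeting cloud LLM usage for a grant, a rough token-based estimate can help, since most vendors price per 1,000 (or 1,000,000) tokens of input and output. The sketch below illustrates the arithmetic; the per-1K-token prices are placeholders, not actual vendor rates, so check your vendor's current price sheet before budgeting.

```python
def estimate_cost(input_tokens, output_tokens,
                  price_in_per_1k=0.003, price_out_per_1k=0.015):
    """Estimate USD cost of one LLM request under token-based pricing.

    The default prices are illustrative placeholders, NOT real vendor rates.
    """
    return (input_tokens / 1000) * price_in_per_1k \
         + (output_tokens / 1000) * price_out_per_1k

# Example: a 2,000-token prompt with a 500-token reply
cost = estimate_cost(2000, 500)   # 2 * 0.003 + 0.5 * 0.015 = 0.0135 USD
```

Multiplying a per-request estimate like this by the expected number of requests gives a first-pass figure for the chargeback account.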
Resources:
One advantage of using cloud computing for generative AI development is that each of the cloud computing vendors has built powerful development environments with all the appropriate libraries needed to build a generative AI application.
Microsoft Azure AI Foundry (formerly Azure AI Studio)
Google Vertex AI
Using Generative AI for Research Productivity
Each week brings new developments in generative AI. For faculty interested in getting started, we recommend exploring some of the GenAI Tools that are free. Faculty can request access to Amplify through the AI Tools Support. Alternatively, if you use Microsoft software heavily, consider Microsoft Copilot; if you are a heavy Google user, try out Gemini AI.