Scholarly Open Access Journal, Peer-Reviewed and Refereed, Impact Factor 8.14 (calculated by Google Scholar and Semantic Scholar | AI-Powered Research Tool), Multidisciplinary, Monthly, Indexed in All Major Databases & Metadata, Citation Generator, Digital Object Identifier (DOI)
Abstract:
The rapid advancement of artificial intelligence (AI), especially in machine learning and deep neural networks (DNNs), has created significant demand for high-performance, low-power computing platforms. Traditional general-purpose processors are inadequate for the massive parallelism and data-intensive nature of AI workloads. To address this, VLSI-based AI accelerators have emerged as a crucial solution, offering specialized hardware architectures optimized for AI tasks. These accelerators incorporate techniques such as systolic arrays for matrix operations, processing-in-memory (PIM) to reduce data movement, and low-precision arithmetic (e.g., INT8 and binary formats) for efficient computation. Advanced memory-hierarchy designs and custom multiply-accumulate (MAC) units further improve performance and energy efficiency. Platforms such as Google’s TPU, dedicated NPUs, and FPGA-based designs are widely adopted in both data centers and edge devices. In addition, hardware-software co-design, quantization-aware training, and neural architecture search (NAS) tailored to hardware constraints are becoming essential in modern VLSI design. This evolving field not only improves AI processing capability but also opens new research opportunities in building scalable, power-efficient, real-time AI systems integrated into SoC platforms.
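Two of the techniques named in the abstract, systolic-array matrix multiplication and INT8 low-precision arithmetic with wide MAC accumulation, can be illustrated with a minimal software sketch. The code below is purely illustrative (it emulates the dataflow in NumPy; it is not hardware description code or any specific accelerator's API): an output-stationary array where each cycle streams one operand wavefront through the processing elements, with INT8 inputs multiplied and accumulated into INT32 registers, as real MAC units do to avoid overflow.

```python
import numpy as np

def quantize_int8(x):
    """Symmetric per-tensor quantization of a float array to INT8."""
    scale = np.max(np.abs(x)) / 127.0
    q = np.round(x / scale).astype(np.int8)
    return q, scale

def systolic_matmul(a_q, b_q):
    """Emulate an output-stationary systolic array: each output cell (PE)
    accumulates INT8 x INT8 products in a wide INT32 register (MAC unit)."""
    m, k = a_q.shape
    k2, n = b_q.shape
    assert k == k2
    acc = np.zeros((m, n), dtype=np.int32)
    for step in range(k):  # one operand wavefront per "cycle"
        # every PE (i, j) multiplies the pair of operands streaming through it
        acc += a_q[:, step:step + 1].astype(np.int32) * \
               b_q[step:step + 1, :].astype(np.int32)
    return acc

rng = np.random.default_rng(0)
a = rng.standard_normal((4, 8))
b = rng.standard_normal((8, 3))
a_q, sa = quantize_int8(a)
b_q, sb = quantize_int8(b)
# Dequantize the INT32 accumulator back to floating point
approx = systolic_matmul(a_q, b_q) * (sa * sb)
print(np.max(np.abs(approx - a @ b)))  # small quantization error
```

The key design point mirrored here is precision asymmetry: operands are narrow (INT8) to save memory bandwidth and multiplier area, while the accumulator is wide (INT32) so that summing many products stays exact before the final rescale.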
Keywords:
VLSI Design, AI Accelerators, Deep Neural Networks (DNNs), Systolic Arrays, Processing-in-Memory (PIM), Low-Power Design, Multiply-Accumulate Units (MAC), Hardware-Software Co-Design, Neural Architecture Search (NAS), Edge AI, High-Performance Computing (HPC).
Cite Article:
"Architectural Advances in VLSI for Efficient AI Processing", International Journal for Research Trends and Innovation (www.ijrti.org), ISSN: 2455-2631, Vol. 10, Issue 7, pp. a234-a238, July 2025. Available: http://www.ijrti.org/papers/IJRTI2507029.pdf
Downloads:
000461
ISSN:
2456-3315 | Impact Factor: 8.14 (calculated by Google Scholar) | Established: 2016