IJRTI
International Journal for Research Trends and Innovation
An International Peer-Reviewed and Refereed, Open Access Journal
ISSN Approved Journal No: 2456-3315 | Impact Factor: 8.14 | ESTD Year: 2016
A scholarly, open-access, peer-reviewed and refereed journal, Impact Factor 8.14 (calculated by Google Scholar and Semantic Scholar | AI-Powered Research Tool), multidisciplinary, monthly, indexed in all major databases and metadata services, with citation generator and Digital Object Identifier (DOI)



Licence

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License
Published Paper Details
Paper Title: ReBERT- An Enhanced BERT
Authors: Priyanka Shee, Santayo Kundu, Anirban Bhar, Moumita Ghosh
Author Reg. ID: IJRTI_188286
Published Paper Id: IJRTI2310067
Published In: Volume 8 Issue 10, October-2023
DOI:
Abstract: BERT (Bidirectional Encoder Representations from Transformers) is a transformer-based language model that captures the context of a word by considering the surrounding words in both directions. It revolutionized natural language processing by capturing rich contextual information, improving performance on various language understanding tasks such as sentiment analysis and text classification. In this article, focusing on User-Generated Content (UGC) in a resource-scarce scenario, we study the ability of BERT (Devlin et al., 2018) to perform lexical normalization. By enhancing its architecture and carefully fine-tuning it, we show that BERT can be a competitive lexical normalization model without the need for any UGC resources aside from 3,000 training sentences. The enhanced BERT model features a hierarchical contextualization module for improved long-range dependency understanding, a domain-specific adaptation layer for specialized language contexts, and efficiency optimizations through dynamic attention-head pruning and weight sharing. Pre-training broadens language comprehension, while task-specific heads enable fine-tuning for downstream tasks. Rigorous evaluation and iterative refinement ensure performance improvements across tasks, addressing limitations and advancing language understanding. This is our first work adapting this model to noisy UGC data and analyzing its ability to handle it. (A minimal illustrative sketch of the fine-tuning setup follows the publication details below.)
Keywords: BERT, NLP, User-Generated Content (UGC), Syntactic Parsing, Named-Entity Recognition
Cite Article: "ReBERT- An Enhanced BERT", International Journal of Science & Engineering Development Research (www.ijrti.org), ISSN:2455-2631, Vol.8, Issue 10, page no.484 - 494, October-2023, Available :http://www.ijrti.org/papers/IJRTI2310067.pdf
Downloads: 000205152
Publication Details: Published Paper ID: IJRTI2310067
Registration ID: 188286
Published In: Volume 8 Issue 10, October-2023
DOI (Digital Object Identifier):
Page No: 484 - 494
Country: India
Research Area: Engineering
Publisher: IJ Publication
Published Paper URL : https://www.ijrti.org/viewpaperforall?paper=IJRTI2310067
Published Paper PDF: https://www.ijrti.org/papers/IJRTI2310067
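
Illustrative Code Sketch

The abstract frames lexical normalization as fine-tuning BERT with a task-specific head on a small set of UGC sentences. The following is a minimal, hypothetical sketch of such a setup using the Hugging Face transformers library, treating normalization as token-level classification over a closed label set; the label set, the toy sentence, and the one-label-per-word simplification are illustrative assumptions, not the authors' released code or exact method.

import torch
from transformers import BertTokenizerFast, BertForTokenClassification

# Hypothetical closed label set: each class is a normalized target form,
# plus "KEEP" for tokens that are already standard. Illustrative only.
labels = ["KEEP", "you", "are", "going", "to"]
label2id = {l: i for i, l in enumerate(labels)}

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForTokenClassification.from_pretrained(
    "bert-base-uncased", num_labels=len(labels)
)

# One toy UGC sentence: noisy words paired with word-level normalization labels.
words = ["u", "r", "going", "to", "win"]
word_labels = ["you", "are", "KEEP", "KEEP", "KEEP"]

enc = tokenizer(words, is_split_into_words=True, return_tensors="pt")

# Align word-level labels to subword tokens; special tokens and continuation
# pieces get -100 so the cross-entropy loss ignores them.
aligned, prev = [], None
for wid in enc.word_ids(batch_index=0):
    if wid is None or wid == prev:
        aligned.append(-100)
    else:
        aligned.append(label2id[word_labels[wid]])
    prev = wid

outputs = model(**enc, labels=torch.tensor([aligned]))
outputs.loss.backward()  # an optimizer step would complete one training iteration
print("toy training loss:", float(outputs.loss))

In the resource-scarce setting described in the abstract, the roughly 3,000 UGC training sentences would supply such word/label pairs; the closed-vocabulary simplification stands in for whatever normalization scheme the paper's enhanced architecture actually uses.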
