Research on edge computing for the Industrial Internet of Things (IIoT) focuses on leveraging computational resources near the data source to improve latency, reliability, and efficiency in industrial operations. IIoT environments generate massive amounts of real-time data from sensors, actuators, and machines, and transmitting all of this data to centralized cloud servers can introduce high latency, bandwidth bottlenecks, and security risks. Edge computing addresses these challenges by processing, analyzing, and filtering data at or near the network edge, enabling real-time decision-making, predictive maintenance, anomaly detection, and operational optimization.

Researchers have explored architectures that integrate edge nodes with fog and cloud layers, optimizing task allocation based on resource availability, network conditions, and application criticality (a minimal offloading heuristic of this kind is sketched at the end of this section). Security and privacy are key considerations, with studies proposing lightweight encryption, authentication, and trust frameworks for edge-assisted IIoT systems. AI and machine learning techniques are increasingly integrated at the edge for predictive analytics, fault detection, and intelligent automation, while minimizing data transmission and reducing energy consumption; the second sketch below illustrates this pattern with a simple streaming anomaly detector.

Hybrid models combining edge intelligence with blockchain, digital twins, and Software-Defined Networking (SDN) have been investigated to enhance transparency, traceability, and network resilience. Performance analyses typically focus on latency reduction, throughput improvement, energy efficiency, and scalability in heterogeneous IIoT deployments. Despite these advances, challenges remain in managing resource-constrained edge nodes, ensuring secure and reliable communication, and achieving seamless interoperability with legacy industrial systems. Overall, the literature underscores that edge computing is a critical enabler for real-time, secure, and efficient IIoT applications, bridging the gap between constrained devices and centralized cloud infrastructures.
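To make the task-allocation discussion above concrete, the following is a minimal, illustrative Python sketch of a latency- and criticality-aware placement decision at a single edge node. The class names, thresholds, and the heuristic itself are assumptions chosen for illustration, not a method drawn from any specific paper in the literature.

```python
from dataclasses import dataclass
from enum import Enum


class Criticality(Enum):
    LOW = 1       # e.g. periodic telemetry aggregation
    MEDIUM = 2    # e.g. quality-inspection analytics
    HIGH = 3      # e.g. safety interlocks, closed-loop control


@dataclass
class Task:
    name: str
    cpu_demand: float        # fraction of one edge core required (assumed unit)
    deadline_ms: float       # end-to-end latency budget
    criticality: Criticality


@dataclass
class EdgeNode:
    cpu_available: float      # free CPU capacity (cores)
    local_latency_ms: float   # typical processing latency at the edge
    cloud_rtt_ms: float       # measured round-trip time to the cloud/fog tier


def place_task(task: Task, node: EdgeNode) -> str:
    """Return 'edge' or 'cloud' for a single task.

    Illustrative heuristic: keep a task at the edge if it is safety-critical
    or if offloading would blow its latency budget; otherwise offload when
    the node is short on CPU.
    """
    # Rough offloading latency estimate: WAN round trip plus processing time.
    cloud_latency = node.cloud_rtt_ms + node.local_latency_ms

    if task.criticality is Criticality.HIGH:
        return "edge"                       # avoid WAN variability for critical control
    if cloud_latency > task.deadline_ms:
        return "edge"                       # offloading would miss the deadline
    if task.cpu_demand > node.cpu_available:
        return "cloud"                      # edge node is saturated
    return "edge"


if __name__ == "__main__":
    node = EdgeNode(cpu_available=0.3, local_latency_ms=5.0, cloud_rtt_ms=80.0)
    tasks = [
        Task("vibration-fft", cpu_demand=0.2, deadline_ms=50.0, criticality=Criticality.HIGH),
        Task("batch-report", cpu_demand=0.5, deadline_ms=5000.0, criticality=Criticality.LOW),
    ]
    for t in tasks:
        print(f"{t.name}: {place_task(t, node)}")
```

Schedulers proposed in the literature typically extend such rules with queueing models, energy costs, or learned policies; the point here is only the shape of the decision, in which criticality and deadline constraints are checked before resource availability.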
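Similarly, the edge-side analytics described above can be illustrated with a lightweight streaming anomaly detector that forwards only suspicious readings upstream, reducing data transmission and energy use. The exponentially weighted statistics, threshold values, and the simulated temperature stream below are assumptions made for this sketch, not a reference implementation from the surveyed works.

```python
import random


class EwmaAnomalyDetector:
    """Streaming anomaly detector cheap enough for a constrained edge gateway.

    Maintains exponentially weighted estimates of the mean and variance of a
    sensor signal and flags readings more than `k` standard deviations away,
    so only anomalous samples need to leave the edge.
    """

    def __init__(self, alpha: float = 0.05, k: float = 3.0):
        self.alpha = alpha      # smoothing factor: smaller = slower adaptation
        self.k = k              # alarm threshold in standard deviations
        self.mean = None
        self.var = 0.0

    def update(self, x: float) -> bool:
        """Feed one reading; return True if it should be reported upstream."""
        if self.mean is None:           # first sample initializes the baseline
            self.mean = x
            return False
        deviation = x - self.mean
        if self.var > 0:
            is_anomaly = deviation * deviation > self.k * self.k * self.var
        else:
            is_anomaly = False          # not enough history yet to judge
        # Update running statistics (standard EWMA recursions).
        self.mean += self.alpha * deviation
        self.var = (1 - self.alpha) * (self.var + self.alpha * deviation * deviation)
        return is_anomaly


if __name__ == "__main__":
    random.seed(0)
    detector = EwmaAnomalyDetector()
    # Simulated bearing-temperature stream with an injected fault at t = 300.
    for t in range(500):
        reading = 60.0 + random.gauss(0, 0.5) + (15.0 if t >= 300 else 0.0)
        if detector.update(reading):
            print(f"t={t}: anomalous reading {reading:.1f} °C forwarded to cloud")
```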