National Preprint Platform
Debuting in China, known to the world
The third law of thermodynamics has been verified experimentally, but how to express it rigorously in theory has remained an unresolved problem for over a century. It is found that, by introducing an innovative method, the Nernst equation can be obtained directly from experimental data on chemical reactions at low temperatures, without the artificial auxiliary assumptions that appear in textbooks, so that the Nernst theorem should be replaced by the Nernst statement. It is also found that a heat-capacity statement can be obtained from experimental data on heat capacities at low temperatures. The heat-capacity statement and the Nernst statement are proved to be mutually derivable, and the two are therefore equivalent. The unattainability principle of absolute zero temperature is only a corollary of the Nernst statement or the heat-capacity statement. At the same time, defects and deficiencies in the textbook presentations of the third law of thermodynamics are pointed out and corrected. The results show clearly that the Nernst theorem and the unattainability principle of absolute zero temperature should be withdrawn as statements of the third law. Importantly, the Nernst statement and the heat-capacity statement are two equivalent statements of the third law of thermodynamics; this resolves the century-long debate over the third law and supplies complete statements of it.
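For orientation, the two statements under discussion have standard textbook forms; the notation below ($\Delta S$ for the entropy change of an isothermal process, $C_x$ for a heat capacity at fixed $x$) is the generic one, not necessarily the authors':

```latex
% Nernst statement: the entropy change of any isothermal process
% vanishes as the temperature approaches absolute zero
\lim_{T \to 0^{+}} (\Delta S)_{T} = 0
% Heat-capacity statement: every heat capacity vanishes in the same limit
\lim_{T \to 0^{+}} C_{x} = 0
```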
This manuscript studies node clustering in graphs with multivariate attributes at each node. The framework places node-specific priors on low-dimensional latent representations, coupled with a neural decoder that links the observed attributes to the latent variables. Structural and attribute information are incorporated through a graph-fused LASSO regularization on the prior means, which promotes node clustering. The optimization problem is solved via the alternating direction method of multipliers (ADMM), with Langevin dynamics for posterior inference. Simulation studies on grid graphs, and applications to real data with complex settings, demonstrate the effectiveness of the proposed clustering method.
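A minimal sketch of the graph-fused LASSO ingredient, under the usual definitions (function names and the soft-thresholding step are illustrative, not the authors' implementation): the penalty couples prior means across graph edges, so linked nodes are pulled toward identical means, which is what induces clusters.

```python
import numpy as np

def graph_fused_lasso_penalty(mu, edges, lam):
    """lam * sum over edges (i, j) of ||mu_i - mu_j||_1.

    mu: (n_nodes, d) array of prior means; edges: list of (i, j) pairs.
    The penalty is zero exactly when linked nodes share a mean,
    which is what drives clustering of the prior means.
    """
    return lam * sum(np.abs(mu[i] - mu[j]).sum() for i, j in edges)

def soft_threshold(v, t):
    """Elementwise soft-thresholding, the proximal map of the l1 norm;
    in an ADMM solver it typically appears as the update for the
    auxiliary edge-difference variables."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)
```

On a toy graph with a two-node cluster plus an outlier, only the cross-cluster edge contributes to the penalty.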
Fault-tolerant quantum computing requires understanding how error-correcting codes perform on diverse physical hardware. This is typically assessed via noisy stabilizer simulation of logical circuits at HPC scale, combined with a noise model that yields a logical error rate for the relevant code distances and depths. The uniform depolarizing model is the standard baseline, but its homogeneous assumptions fail to capture the heterogeneity, asymmetries, and correlations of real devices, where Pauli, measurement, and spatio-temporal errors are not weakly coupled. Yet these same structured features create opportunities for joint code-hardware co-design, motivating noise models that more faithfully reflect target hardware while remaining tractable to simulate. We introduce FTPrimitiveBench, a systematic benchmarking approach for studying how logical primitives interact with hardware-motivated noise. It supports both custom specifications and representative structured noise families: Pauli bias, measurement bias, and spatial or spatio-temporal non-uniformity -- together with generators for core surface-code Clifford primitives: logical memory, lattice surgery, transversal logical Hadamard, and the logical phase gate via lattice surgery. We find that structured noise affects these primitives in qualitatively distinct ways, with outcomes shaped by the interplay between noise model, primitive, and decoder choice. These results extend memory benchmarks to active logical computation, where the interaction between noise structure and primitive implementation matters. By standardizing the link between noise-model specification and primitive construction, FTPrimitiveBench enables reproducible comparative studies of QEC protocols and decoders, supporting hardware-aware co-design of fault-tolerant architectures. Code: https://github.com/ShuwenKan/FTPrimitiveBench.
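As one concrete instance of the structured noise families named above, a Z-biased Pauli channel can be parameterized by a total error rate p and a bias η = pZ/(pX + pY). The sketch below uses a common convention from the biased-noise literature; it is an assumption for illustration, not FTPrimitiveBench's own specification.

```python
def biased_pauli_probs(p, eta):
    """Split a total Pauli error rate p into (pX, pY, pZ) with bias
    eta = pZ / (pX + pY).  eta = 0.5 recovers the uniform depolarizing
    baseline (pX = pY = pZ = p/3); eta -> infinity approaches pure
    dephasing, a typical hardware-motivated asymmetry."""
    pX = pY = p / (2.0 * (1.0 + eta))
    pZ = p * eta / (1.0 + eta)
    return pX, pY, pZ
```

The probabilities sum to p for any bias, so sweeping η isolates the effect of error structure at fixed total error rate.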
We study a class of branching processes in which the offspring distribution is not specified directly but is induced by a cycle of internal colony growth, catastrophic reduction and structured dispersal. The parameters governing growth, survival and dispersal are allowed to vary deterministically or randomly from one generation to the next, giving rise to branching processes in varying and random environments with implicitly defined offspring laws. We show that survival and extinction are governed entirely by the associated log-mean process, exactly as in the classical theory. The paper treats four qualitatively different dispersal mechanisms and establishes a universal ordering of the induced offspring means. For Poissonian growth with binomial survival, explicit thresholds are obtained that determine extinction or survival uniformly over all four mechanisms. A series of ecologically motivated examples with Yule-Simon growth illustrates the versatility of the framework.
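A hedged sketch of the Poissonian-growth/binomial-survival cycle described above (parameter names hypothetical): each individual founds a colony of Poisson(λ) members, each of which survives the catastrophe independently with probability s. By Poisson thinning the induced offspring mean is λs, so the log-mean process accumulates log(λs) terms, whose sign determines extinction versus survival.

```python
import math
import numpy as np

def next_generation(z, lam, s, rng):
    """One growth-catastrophe cycle: z individuals each found a
    Poisson(lam) colony, then each colony member survives independently
    with probability s (binomial thinning)."""
    if z == 0:
        return 0
    colonies = rng.poisson(lam, size=z)          # internal colony growth
    return int(rng.binomial(colonies, s).sum())  # catastrophic reduction

def log_mean_increment(lam, s):
    """Contribution of this generation's environment to the log-mean
    process; a sum drifting to -infinity signals extinction."""
    return math.log(lam * s)
```

In a varying or random environment, λ and s change from generation to generation, and only the accumulated log-mean increments matter, exactly as in the classical theory.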
This study investigates the performance and ergotropy protection of open collective quantum batteries subject to superradiant decay. By employing a passive spectral detuning strategy within an intermediate cavity, an optimal detuning value ($\Delta^*$) is analytically derived and numerically verified to spectrally isolate the system and protect quantum coherence, achieving up to 1088% ergotropy improvement for single qubits and superextensive collective advantage for $N \ge 3$. Our analysis resolves a "non-Markovian paradox," revealing that maximizing ergotropy does not strictly require non-Markovian memory; rather, suppressing environmental memory via detuning optimally preserves coherence, which serves as the fundamental resource. Survival maps across different environments demonstrate that thermal noise dissipates coherence more severely than telegraph noise. Finally, we establish that collective amplification of the effective coupling ($g_{\rm eff} = g\sqrt{N}$) inevitably drives large qubit arrays into the ultra-strong coupling regime, providing a quantitative ceiling $N_{\rm max}$ on the validity of the Tavis-Cummings description and the current ergotropy protection protocol.
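The quantitative ceiling mentioned at the end can be sketched as follows. The onset of ultra-strong coupling at $g_{\rm eff}/\omega \gtrsim 0.1$ is the conventional threshold from the literature, assumed here for illustration rather than taken from the paper.

```python
import math

def n_max(g, omega, usc_threshold=0.1):
    """Largest qubit number N for which the collective coupling
    g_eff = g * sqrt(N) stays below the ultra-strong-coupling onset
    usc_threshold * omega, i.e. where a Tavis-Cummings (rotating-wave)
    description remains a valid approximation."""
    return math.floor((usc_threshold * omega / g) ** 2)
```

Because $g_{\rm eff}$ grows as $\sqrt{N}$, the ceiling scales as $N_{\rm max} \propto (\omega/g)^2$: weaker single-qubit coupling permits quadratically larger arrays.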














