BMC Psychology is an open access journal covering a broad range of psychology topics, including developmental, clinical, cognitive, and social psychology, as well as the intersection of psychology with law, policy, and individual differences. The journal publishes special collections and reports a 2.7 Impact Factor and 22 days to first decision.
BMC has an evolving portfolio of some 300 peer-reviewed journals, sharing discoveries from research communities in science, technology, engineering and medicine. In 1999, we made high-quality research open to everyone who needed to access it, and in making the open access model sustainable, we changed the world of academic publishing. We are committed to continual innovation in research publishing to better support the needs of our communities, ensuring the integrity of the research we publish and championing the benefits of open research for all. Our leading research journals include selective titles such as BMC Biology, BMC Medicine, Genome Biology, Genome Medicine and BMC Global and Public Health; academic journals such as Journal of Hematology & Oncology, Malaria Journal and Microbiome; and the BMC series, 53 inclusive journals focused on the needs of individual research communities. We also partner with leading institutions and societies to publish journals on their behalf. BMC is part of Springer Nature, giving us greater opportunities to help authors everywhere make more connections with research communities across the world.
DeepSeek-R1 is a reasoning model trained via large-scale reinforcement learning (RL). Its precursor, DeepSeek-R1-Zero, was trained with RL alone, without supervised fine-tuning (SFT), and demonstrated remarkable reasoning behaviors such as self-verification and reflection, but encountered challenges such as endless repetition and poor readability. DeepSeek-R1 addresses these challenges and achieves performance comparable to OpenAI-o1 across math, code, and reasoning tasks.
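To illustrate the kind of rule-based reward that can drive such RL training on verifiable tasks, here is a minimal sketch in Python. The `<think>` tag convention, the weights, and the function name are illustrative assumptions, not DeepSeek's published code:

```python
import re

def reasoning_reward(completion: str, reference_answer: str) -> float:
    """Hypothetical rule-based reward combining a format check and a
    final-answer accuracy check. Tags and weights are assumptions."""
    reward = 0.0
    # Format reward: the completion should wrap its chain of thought
    # in <think>...</think> before stating the final answer.
    if re.search(r"<think>.*?</think>", completion, flags=re.DOTALL):
        reward += 0.5
    # Accuracy reward: compare the text after </think> to the reference.
    answer = completion.split("</think>")[-1].strip()
    if answer == reference_answer.strip():
        reward += 1.0
    return reward

print(reasoning_reward("<think>2 + 2 is 4</think>4", "4"))  # 1.5
```

Because the reward depends only on checkable outcomes rather than human preference labels, it can scale to large RL runs without an SFT stage.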
PSYCH OpenIR is the institutional repository of the Institute of Psychology, Chinese Academy of Sciences, providing access to research outputs and information on researchers' profiles and research directions.
DeepSeek-V3 is a powerful Mixture-of-Experts (MoE) language model with 671 billion total parameters, of which 37 billion are activated per token. It achieves efficient inference and cost-effective training through innovative load-balancing strategies and a multi-token prediction training objective. The model is pre-trained on 14.8 trillion diverse, high-quality tokens and outperforms other open-source models on a wide range of benchmarks.
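To make the total-versus-activated parameter distinction concrete, below is a minimal top-k MoE routing sketch in Python/PyTorch. The sizes, the softmax router, and the class name are illustrative assumptions; DeepSeek-V3's actual design (auxiliary-loss-free load balancing, shared plus routed experts) is considerably more involved:

```python
import torch
import torch.nn as nn

class TinyMoE(nn.Module):
    """Minimal sketch of top-k MoE routing: every token is scored against
    all experts, but only k experts run per token, so the parameters
    touched per token are a small fraction of the total."""
    def __init__(self, d_model=64, n_experts=8, k=2):
        super().__init__()
        self.k = k
        self.router = nn.Linear(d_model, n_experts, bias=False)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, 4 * d_model), nn.GELU(),
                          nn.Linear(4 * d_model, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x):  # x: (tokens, d_model)
        gate = torch.softmax(self.router(x), dim=-1)     # (tokens, n_experts)
        weights, idx = torch.topk(gate, self.k, dim=-1)  # keep top-k experts
        out = torch.zeros_like(x)
        for slot in range(self.k):
            for e in range(len(self.experts)):
                mask = idx[:, slot] == e
                if mask.any():  # run expert e only on the tokens routed to it
                    out[mask] += weights[mask, slot].unsqueeze(-1) * \
                        self.experts[e](x[mask])
        return out

moe = TinyMoE()
tokens = torch.randn(5, 64)
print(moe(tokens).shape)  # torch.Size([5, 64])
```

With 8 experts and k=2 in this toy, each token activates roughly a quarter of the expert parameters per layer, the same principle by which DeepSeek-V3 activates 37B of its 671B parameters per token.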