Academic publishing is moving toward datafication and intelligence. This transformation, built on big data, new platforms, and artificial intelligence, will change the paradigm of scholarly communication and the business models of publishing. Facing the digital future of scholarly communication, publishing giants, start-ups, and academic institutions alike are actively positioning themselves to seize first-mover advantage. Focusing on the interplay of three elements, data, algorithms, and platforms, this paper reviews the 2017 developments and innovations in European and American academic publishing in terms of research data publishing and citation, the future trends of academic publishing and data sharing, intelligent data mining, and the competition and cooperation between established and emerging academic platforms. Combining these future trends with the global open-data movement, the paper offers reflections and suggestions for the development of knowledge services in China.
In an era where journals are under mounting pressure to implement double-blind peer review while handling rapidly increasing submission volumes, editorial offices still depend largely on manual redaction or coarse “document inspector” tools to remove identifying details from manuscripts and reviewer reports. These practices are labor intensive, difficult to standardize across editors, and often act as blunt instruments that disrupt the review process by removing useful layout metadata together with sensitive information. They also provide limited protection against implicit semantic leakage. To mitigate these risks and reconcile the tension between robust privacy protection and editorial efficiency, this study proposes an intelligent bidirectional privacy anonymization strategy that integrates rule-based algorithms with large language models (LLMs) and implements it as a scalable browser/server application aligned with editorial workflows. Grounded in an analysis of typical submission materials, the system formalizes three design dimensions: supported file formats, sensitive-information categories, and high-risk document locations. It supports mainstream word-processing formats, targets core identifiers for authors and reviewers, and concentrates on predefined high-risk locations. On this foundation, we construct a two-layer hybrid engine. A rule-based layer, implemented against the Office Open XML schema, uses regular expressions and structural cues to deterministically locate and neutralize well-structured fields such as author lists, affiliations, and email addresses, while explicitly protecting in-text citations and reference lists as spans that must not be altered. An LLM-based layer is then invoked through structured prompts that encode editorial heuristics and send only minimal, context-tagged text segments to the model. This layer identifies and masks residual identity cues that escape rule-based detection—the "long tail" of semantic leakage.
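The rule-based layer described above can be sketched in miniature: a deterministic pass that masks well-structured identifiers (here, email addresses) while treating in-text citation spans as protected regions that must never be rewritten. The regex patterns and the `[MASKED]` placeholder are illustrative assumptions, not the paper's actual implementation, which operates on the Office Open XML schema rather than plain text.

```python
import re

# Hypothetical sketch of the deterministic rule-based layer.
# EMAIL_RE and CITATION_RE are simplified stand-ins for the real rules.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+(?:\.[\w-]+)+")
CITATION_RE = re.compile(r"\[\d+(?:,\s*\d+)*\]|\([A-Z][a-z]+ et al\., \d{4}\)")

def protected_spans(text):
    """Return (start, end) spans of in-text citations that must stay intact."""
    return [m.span() for m in CITATION_RE.finditer(text)]

def mask_identifiers(text):
    """Mask email addresses, skipping any match inside a protected span."""
    spans = protected_spans(text)
    def in_protected(start, end):
        return any(s <= start and end <= e for s, e in spans)
    out, last = [], 0
    for m in EMAIL_RE.finditer(text):
        if in_protected(*m.span()):
            continue  # never rewrite inside a citation span
        out.append(text[last:m.start()])
        out.append("[MASKED]")
        last = m.end()
    out.append(text[last:])
    return "".join(out)

sample = "Contact: jane.doe@univ.edu. Prior work (Smith et al., 2020) [1, 2]."
print(mask_identifiers(sample))
# → Contact: [MASKED]. Prior work (Smith et al., 2020) [1, 2].
```

In the full engine, segments that survive this pass would then be forwarded, context-tagged, to the LLM layer for the residual semantic cues that no regular expression can anticipate.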
For PDF files, whose internal structure is less amenable to safe in-place editing, the system adopts a non-destructive “sensitive-information warning” mode in which extracted text is screened and suspected identifiers are flagged for manual verification rather than being automatically rewritten. The hybrid approach is extended symmetrically to reviewer reports. For DOC/DOCX files, the system parses comment and revision nodes and replaces user names and contact details with neutral labels such as "journal editor" or "reviewer A" while preserving the original review content; for PDF reports, suspected identity fields are similarly highlighted for anonymization or human confirmation. The anonymization engine is exposed through a web interface and standardized application programming interfaces, enabling on-demand use by editors and integration with editorial management systems at key workflow stages. An internal evaluation of real manuscripts and reviewer reports by experienced editors indicates that the rule-plus-LLM strategy more reliably removes explicit identifiers and reduces implicit identity cues in high-risk locations than manual or rule-only approaches, without altering in-text citations or reference lists, and substantially shortens the preparation time for double-blind review. A comparison of manual, rule-only, LLM-only, and hybrid schemes suggests that the proposed engine achieves a favourable balance of precision, coverage, consistency, and operational cost. Overall, this study demonstrates the feasibility of deploying a rule-plus-LLM hybrid engine for intelligent bidirectional anonymization in journal peer review and offers a practical, scalable pathway for journals seeking to strengthen privacy protection and editorial efficiency, while building a fairer and more trustworthy peer-review ecosystem.
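The comment-node anonymization for reviewer reports can likewise be sketched with the standard library: in a DOCX package, reviewer names live in the `w:author` (and `w:initials`) attributes of `w:comment` elements in `word/comments.xml`. A minimal sketch, assuming a toy comments part, maps each distinct author to a stable neutral label ("reviewer A", "reviewer B", ...) so that comment threads remain attributable, while the review text itself is left untouched. The XML fragment and label scheme below are illustrative, not the system's actual code.

```python
import xml.etree.ElementTree as ET

# WordprocessingML namespace used by DOCX comment parts.
W = "http://schemas.openxmlformats.org/wordprocessingml/2006/main"

# Toy stand-in for a real word/comments.xml part (hypothetical content).
COMMENTS_XML = f"""<w:comments xmlns:w="{W}">
  <w:comment w:id="0" w:author="Alice Smith" w:initials="AS">
    <w:p><w:r><w:t>Please clarify Eq. 3.</w:t></w:r></w:p>
  </w:comment>
  <w:comment w:id="1" w:author="Bob Lee" w:initials="BL">
    <w:p><w:r><w:t>Agreed.</w:t></w:r></w:p>
  </w:comment>
  <w:comment w:id="2" w:author="Alice Smith" w:initials="AS">
    <w:p><w:r><w:t>See also Fig. 2.</w:t></w:r></w:p>
  </w:comment>
</w:comments>"""

def anonymize_comments(xml_text):
    """Replace w:author with stable neutral labels; keep comment text intact."""
    root = ET.fromstring(xml_text)
    labels = {}  # original author -> neutral label, consistent across comments
    for comment in root.findall(f"{{{W}}}comment"):
        author = comment.get(f"{{{W}}}author")
        if author not in labels:
            labels[author] = f"reviewer {chr(ord('A') + len(labels))}"
        comment.set(f"{{{W}}}author", labels[author])
        comment.set(f"{{{W}}}initials", labels[author].split()[-1])
    return root, labels

root, labels = anonymize_comments(COMMENTS_XML)
print(labels)  # → {'Alice Smith': 'reviewer A', 'Bob Lee': 'reviewer B'}
```

Keeping the author-to-label mapping stable is the design point: an editor reading the anonymized report can still follow a back-and-forth between "reviewer A" and "reviewer B" without learning who they are.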
The rapid advancement and deepening integration of digital technologies have fundamentally reshaped the operational ecosystem of the publishing industry. In the era of comprehensive digitalization, the traditional operational models of publishing units are increasingly insufficient to address the evolving demands of new business formats, characterized by the convergence of content forms, the personalization of user needs, and the diversification of market competition. This paper systematically examines the ecological transformations within the publishing industry under digitalization, focusing on three key operational dimensions: content production, product forms, and marketing distribution. These transformations are driven by a confluence of factors, including technological innovation, policy guidance, and shifting market demands. In terms of content production, technologies such as artificial intelligence and big data are reshaping the logic of topic selection, editing, proofreading, and even content generation, facilitating a shift from experience-driven to data-driven approaches and from human-led to human-machine collaboration. Regarding product forms, publications have evolved from static print-based texts into dynamic, interactive, and multimedia digital products, advancing toward immersive experiences through technologies such as virtual reality and augmented reality. In the realm of marketing and distribution, precision marketing based on the analysis of user behavior data is gradually becoming mainstream, while distribution channels have expanded from traditional physical networks to a multidimensional system that integrates online and offline channels across diversified platforms. In response to these ecological shifts, this study proposes a three-dimensional transformation pathway for publishing units, centered on business process reengineering, organizational restructuring, and talent structure optimization.
Business process reengineering emphasizes the full digitalization and intensification of workflows, advocating for the establishment of user-centric, data-driven, and networked operational processes. This approach moves beyond linear production to agile, iterative methodologies that enhance efficiency and responsiveness. Organizational restructuring calls for the development of flexible, flat, and cross-departmental collaborative mechanisms. Such structures are designed to dismantle traditional silos, foster innovation, and support rapid iteration and experimentation, enabling seamless coordination across functions to adapt to dynamic market conditions and technological advancements. Talent structure optimization highlights the critical need to cultivate and attract interdisciplinary professionals with both content expertise and technical proficiency. Emerging roles such as product managers, data analysts, and user operations specialists are essential to bridging the gap between traditional publishing competencies and digital innovation demands. Furthermore, strategic leadership with a vision for integrating technology and content is paramount to guiding sustainable transformation. In conclusion, this paper provides a comprehensive and practical framework to support traditional publishing units in their strategic and operational evolution within the digital landscape. By addressing these core dimensions of processes, organization, and talent, publishing units can transition from traditional content providers to dynamic builders of knowledge service ecosystems, thereby enhancing their competitiveness and ensuring sustainable development in the digital age.
Publication ethics represents the institutional embodiment of research integrity within academic publishing. Its fundamental goal is to safeguard the authenticity and fairness of scientific outputs. With the rapid development of artificial intelligence generated content (AIGC), the academic publishing ecosystem is undergoing profound transformation. While AIGC offers unprecedented efficiency and innovation, it also introduces new ethical risks, such as the blurring of authorship responsibility, increasingly realistic fabrication of data and images, and the intelligent automation of plagiarism and textual manipulation. These emerging challenges have exposed several weaknesses in China’s academic journal system, including insufficient regulatory coverage, delayed detection mechanisms, and limited accountability tracing in the construction of publication ethics. Taking the AIGC context as the point of departure, this paper systematically analyzes the new characteristics of academic misconduct and the practical dilemmas faced by scholarly journals in ethical governance. It then proposes a three-dimensional pathway for improvement—through institutional, technological, and educational-cultural perspectives. At the institutional level, it is crucial to strengthen alignment with national research integrity policies, refine authorship and disclosure standards, and establish comprehensive systems for detecting misconduct and enforcing accountability. Furthermore, cross-journal and cross-disciplinary collaboration should be promoted to enhance collective governance capabilities and ensure consistent ethical standards across the publishing ecosystem. At the technological level, journals should develop and implement multimodal detection platforms capable of identifying AI-generated or manipulated content in text, data, and images. 
Pilot projects in high-risk disciplines—such as biomedical sciences, computer vision, and social data analytics—should focus on verification of results and sharing of information on suspected misconduct. The integration of AI-assisted forensic tools and blockchain-based record-keeping may further enhance transparency and traceability throughout the publication process. At the educational and cultural level, differentiated ethical training programs should be implemented for key stakeholders, including graduate students, editors, and peer reviewers. These programs should emphasize the identification of AIGC-related risks, the responsible use of AI tools, and adherence to publication integrity principles. By cultivating a shared culture of ethical awareness and professional accountability, journals can move beyond reactive regulation toward proactive ethical governance. This study argues that only through the synergistic interaction of institutional constraints, technological support, and cultural guidance can the publishing community effectively address the ethical challenges posed by AIGC. This multidimensional approach will not only help preserve the credibility and integrity of academic publishing but also provide sustainable support for the broader construction of a national research integrity system. In the era of intelligent content generation, reinforcing publication ethics is no longer a peripheral concern but a foundational requirement for ensuring that scholarly communication continues to serve truth, transparency, and public trust.
China’s journal publishing industry stands at a critical juncture as the country moves from the 14th Five-Year Plan period (2021–2025) to the upcoming 15th Five-Year Plan. This transition coincides with accelerated modernization, rapid diffusion of data- and AI-driven technologies, and an increasingly complex global environment. Against this backdrop, this study examines how Chinese journals can reposition themselves from a primarily print-based, administratively driven system to a more market-responsive, digitally integrated and internationally competitive knowledge infrastructure. Drawing on policy text analysis, industry statistics and case studies of leading journal groups, this study first reviews key developments since 2021 in publishing operations, quality evaluation, structural layout, media convergence, as well as the policy and funding framework. It identifies clear progress in terms of citation impact, clustering of high-quality titles, and the emergence of digital platforms and data services. At the same time, it highlights persistent challenges, including homogeneous editorial positioning; limited economies of scale and brand building; shallow forms of "digitization" that stop at format conversion; and a shortage of interdisciplinary talent that combines content expertise with data, product and technological capabilities.
This paper argues that, guided by Xi Jinping Thought on Culture and China’s agenda for high-quality development, the 15th Five-Year Plan period calls for a redefinition of the core value of journals: from carriers of discrete articles to nodes in a national knowledge and innovation network. Specifically, this requires supply-side reform that prioritizes problem-oriented content, clearer field segmentation, and differentiated journal portfolios in key disciplines; systematic development, standardization and governance of journal data resources as strategic assets; and the deep embedding of artificial intelligence (AI) into peer review, production workflows, dissemination and user services on the basis of high-quality, responsibly governed Chinese-language corpora. Furthermore, this paper stresses the need to strengthen academic clusters and journal alliances in priority fields to overcome structural fragmentation, enhance international visibility and support open science practices compatible with China’s regulatory environment. Finally, it advocates an integrated approach to talent pipelines and governance modernization, combining editorial professionalism, technological literacy and managerial skills and aligning incentives for universities, research institutions and publishing organizations. By situating China’s journal reforms within broader debates on platformization and AI in scholarly communication, this paper offers a case study of how a large emerging knowledge system seeks to coordinate public value and market logic in the next phase of journal development. The findings may also be of interest to policymakers and publishers in other emerging markets facing similar pressures to upgrade their journal systems in the AI era.
During the 14th Five-Year Plan period, China's publishing industry achieved good results in theme publishing, digitalization, integrated development, and the international dissemination of publications and Chinese culture. Overall, however, it remains in a period of adjustment and recovery after the pandemic. This adjustment and recovery encompasses both the market's spontaneous adjustment and recovery and the industry's active self-adjustment and recovery. The period has seen both unstable and positive development trends, and it is worth conducting in-depth research into the underlying reasons in order to reveal problems, understand the essence, and capture the laws of development. On the basis of industry report data, including the results of the 22nd National Reading Survey, the press and publication statistical bulletin of 2023, the annual reports on China's digital publishing industry for 2023–2024 and 2024–2025, the Statistical Report on China's Internet Development, the report on China's publishing industry development for 2023–2024, and the report on integrated publishing development (2025), this paper analyzes the main characteristics of the adjustment and recovery period during the 14th Five-Year Plan, including the following: first, the overall growth of the publishing market is not yet stable, and adjustment remains the dominant theme; second, digital publishing shows rapid growth across multiple formats, revealing the reasons for adjustment and the direction of recovery; third, the national reading rate has steadily increased and the digital reading rate has grown rapidly, forming the foundation for the recovery and revitalization of publishing.
The continued positive development during the 15th Five-Year Plan clearly has its foundation and basis, and cooperation among the government, industry, academia, and research sectors has played its due role. The development of the publishing industry should maintain strategic resolve and confidence. On this basis, this paper proposes strategies and paths centered on integrated development, content innovation, and operational efficiency, in response to the diversion of readers from the book market by various digital media and products, as well as the narrowing of publishing revenue and profits caused by low-priced sales on content e-commerce platforms. The aim is to provide a reference for the comprehensive recovery and revitalization of the publishing industry in the 15th Five-Year Plan period. The 14th Five-Year Plan period has ended, but the "adjustment period" may continue. Although its length is influenced by many factors, it is inseparable from the efforts and actions of publishing companies. Publishing companies can move from adjustment to recovery and revitalization as soon as possible only by adapting to market adjustment and reshaping, taking the initiative to make self-adjustments, and aligning those self-adjustments with market adjustments. Overall, on the basis of the efforts made during the 14th Five-Year Plan, the publishing industry and publishing enterprises have reason and confidence to look forward to an ambitious and hopeful 15th Five-Year Plan and to continue fulfilling their due responsibilities in building China into a leading country in publishing and culture.
To clarify the role of universities in cultivating world-leading scientific, technical and medical (STM) journals, the research framework of "actuality—desirability—actualization" was constructed. First, the traditional roles of universities in the production and dissemination of scientific papers were analyzed. Second, the reshaped roles of universities in the process of cultivating world-leading STM journals were proposed. Finally, practical pathways for universities to reshape their new roles are provided. At the actuality level, universities have long played dual roles as authors and readers in the ecosystem of STM journals, with a single dimension of participation. As core authors of STM journals, universities possess advantages in disciplines and talent, and are the main source of China's scientific papers. However, universities publish a large number of high-quality papers in international journals, which not only leads to the outflow of manuscript sources but also requires them to pay high article processing charges (APCs). As core readers of STM journals, universities need to subscribe to databases and journals covering various disciplines to enable teachers and students to grasp the latest research trends. However, international academic journals charge high subscription fees each year, and universities also face English-language barriers and data security risks. At the desirability level, with the acceleration of the cultivation of world-leading STM journals, the role of universities continues to expand: they should focus on being active cultivators of world-leading STM journals, independent evaluators of STM journals, and important supporters of international academic discourse power. As important sponsors of STM journals, universities should actively take responsibility for cultivating world-leading STM journals, transforming academic resources and scientific achievements into the driving force for journal development.
As important components of academic research, universities should actively promote the reform of academic evaluation and construct a new evaluation paradigm for STM journals that conforms to the laws of academic development and China's reality. As important participants in building China into a strong country in education, universities should actively transform their disciplinary advantages into discourse advantages and establish a solid position in the global academic governance structure through the virtuous cycle of "discipline—journal—discourse power". On the actualization path, universities need, from the perspective of scholars, to adhere to the "four orientations", "write papers on the motherland's land", and break the development dilemma of "both ends being outside" at its root. From the perspective of sponsors of STM journals, they need to break down the barriers of dispersed resources within the university, build a collaborative development ecosystem of "discipline—talent—journal", and continuously improve the quality and influence of university STM journals.
With the significant advancements in digital technology and the growing diversity of user demands, the publishing industry—an essential force in cultural development during China's 15th Five-Year Plan—is confronting challenges, including reliance on a traditional business model, lengthy processes, and an overwhelming amount of information. However, digital technology also brings transformative opportunities, such as diversified content presentation, intelligent production methods, and precise dissemination. Promoting in-depth integrated development in the publishing industry has become an essential pathway for its transformation and for fulfilling the cultural mission of the new era. This involves not merely the addition of content and technology but a comprehensive reshaping of elements, processes, and ecosystems. In recent years, the "content–platform–service" model has emerged as a universal paradigm for the integrated development of the publishing industry. The model focuses on high-quality content and relies on digital platforms to extend knowledge services, enhance user engagement, and facilitate immersive applications, thereby promoting the construction of an open, interactive, and sustainable knowledge service ecosystem that represents a significant leap from knowledge dissemination to value creation. This paper examines the "content–platform–service" integration model through industry cases, particularly focusing on the practices of the Central China Publishing & Media Group. It provides pathways to enhance core functions and improve the competitive advantages of the publishing industry in the digital age. In exploring this model, the interaction between books and courses fosters resource development, enhances editorial capabilities, and increases the value of books.
Additionally, services evolve toward scenario-based, experiential formats, creating a continuous progression from consumption scenarios to reading and then further into new consumption scenarios. The platform serves as a hub for resource aggregation and distribution, a center for data accumulation, and a bridge within the industrial ecosystem, providing robust support for upgrading content and services. Moreover, the role of editors is evolving from mere processors of submitted manuscripts to project managers engaged in various types of professional activities. The publishing processes are shifting from the traditional linear flow of "editing followed by disseminating" to a more complex model that includes both the "disseminating followed by editing" approach and the "simultaneous editing and disseminating" approach. In conclusion, this paper highlights that achieving deep integrated development in the publishing industry during China’s 15th Five-Year Plan requires continuous enhancement of technological empowerment and innovative mechanisms. It emphasizes the need for strengthened collaboration throughout the entire value chain, the accelerated development of courses, and the enrichment of cultural experience scenarios. Additionally, it calls for deepening copyright value extension, advancing digital transformation, and refining the modern industrial system. Publishers should uphold mainstream ideology, core values, and authoritative expressions of high-quality content, ensuring user engagement through content activation in the digital age while meeting the public's evolving expectations for spiritual and cultural life.
Data elements are the data resources that people invest in the production of goods or services, and they create value in various business scenarios through aggregation, integration, and collaboration. Since China first explicitly listed data as a production factor in 2019, highlighting its function and status as a new type of production factor, it has become a consensus that data elements are an important component of new quality productivity in the digital intelligence era and a fundamental resource for creating value. The value creation of publishing enterprises refers to the process by which enterprises create economic and social value by providing knowledge products or services, and it is the core driving force for the existence and development of enterprises. Data elements are closely related to modern enterprises and are an important resource driving value creation. Because publishing companies are data-intensive enterprises, the role of data within them is becoming increasingly important. Specifically, data elements are the fundamental elements for the digital transformation and integrated development of publishing enterprises in the digital intelligence era. They empower publishing enterprises to create social, economic, and innovative value with their unique characteristics. The mechanism by which data elements empower the value creation of publishing enterprises lies in the exertion of their innovation, collaboration, and predictive mechanisms. However, the current practical predicament of data elements empowering value creation in China's publishing enterprises is reflected, to varying degrees, in the ontological, subject, and environmental aspects of value creation.
To address these challenges, efforts should be made in three aspects: first, standardizing and balancing the collection and effective utilization of publishing data elements; second, establishing a data mindset among publishing enterprises, setting up a publishing data system, and enhancing the level of value transformation; and third, improving data security guarantees, increasing infrastructure investment, and strengthening data technology capabilities. These measures will further enhance the value creation capacity of publishing enterprises. In conclusion, the integration of data elements into the deep integration of publishing not only reshapes the production processes of publishing enterprises and enhances production efficiency, but also provides a powerful data-driven force for the creation of publishing value. Publishing data elements are leading the publishing industry toward a brand-new stage of digital transformation and deep integration with unprecedented power, making the collection, integration and mining of publishing data elements a new paradigm and new driving force for the creation of publishing value. Therefore, delving deeply into the underlying logic of how data elements empower the value creation of publishing enterprises, and recognizing the practical challenges and mitigation pathways associated with the role and value of data elements in the knowledge production, knowledge services, and business management activities of publishing enterprises, is an urgent issue that publishing enterprises need to address in order to develop new-quality productive forces and achieve high-quality development.
Under the new circumstances of deepening globalization, the interweaving and collision of diverse cultures, and the digital transformation of education, engineering textbooks serve as the core knowledge carriers for cultivating engineering talents in universities. They not only fulfill the functions of imparting specialized engineering knowledge and fostering students' practical engineering skills but also shoulder the ideological mission of shaping young students' correct worldview, outlook on life, and values. As the primary platform for the publication of engineering textbooks, the quality of ideological work in university presses directly impacts the direction of engineering talent cultivation and the foundation of national innovation and development. On the basis of the author's years of experience in the publishing industry and teaching practice, this paper employs literature research, systematic analysis, and case study methods as core approaches. Grounded in the dual value of ideological work as both an orientation guarantee and core support for engineering textbooks, it systematically addresses four new ideological challenges faced by engineering textbook publishing in the new era: the increasing subtlety of ideological infiltration leading to difficulties in identification and prevention, digital publishing transformation creating regulatory dilemmas hindering risk control, the development of emerging engineering disciplines posing challenges for value integration and practical implementation, and the diversification of student values raising new demands for value guidance, making precise alignment difficult. This study delves into the underlying causes of issues in the publishing process, such as insufficient ideological review coverage, weak review personnel capabilities, lack of proactive engagement, and outdated review mechanisms, and clarifies the core requirements for ideological development in engineering textbooks.
Combined with practical exploration, this study proposes four-dimensional practical pathways: first, university presses must uphold the fundamental principle of "the Party overseeing ideological work" to improve review mechanisms; second, enhancing oversight capabilities can be achieved by strengthening editors' ideological acuity and handling skills, leveraging parent university resources to bolster professional competence, and increasing the weight of ideological performance evaluations; third, value orientation is reinforced by "strict standards and strong guidance"; and fourth, educational outcomes are enhanced by "innovation emphasis and targeted effectiveness". This study clarifies the core contradictions and key focal points of ideological work in engineering textbook publishing. The findings provide solid theoretical references and actionable practical insights for university presses to fortify ideological security defenses, address pain points and challenges in the ideological construction of engineering textbooks, and promote the high-quality development of such textbooks. This supports universities in cultivating high-caliber engineering talent with both professional competence and a correct value orientation through superior engineering textbooks, thereby contributing to the national innovation-driven development strategy.
Short video platforms have developed as pivotal arenas for academic communication and public engagement. Consequently, a growing number of academic journals have established official accounts on these platforms to disseminate research outputs and enhance public visibility. This trend has attracted considerable scholarly attention, with existing studies mainly focusing on science and technology journals. In contrast, the digital practices of social science journals on short video platforms remain largely unexplored. This study therefore aims to develop a comprehensive analytical framework for digital narratives that integrates subject, mechanism, and effect, and applies it to a multi-platform "narrative field" encompassing the top 30 CSSCI journals ranked by WeChat public account dissemination data. A cross-platform comparative analysis is conducted using data collected from WeChat Channels, Bilibili, and Douyin. This study conceptualizes this dissemination as a complex system with three elements: the dynamics primarily driven by the authority and credibility of the accounts, the specific translation pathways and stylistic narrative strategies employed in the process, and the audience reception and engagement. The first element is related to audience attention, the second determines the depth and sustainability of audience engagement, and the third refers to audience feedback and dissemination effectiveness. Together, these elements form a dynamic "content-context-feedback" chain. Narrative subjects employ distinct discourse to reconstruct academic authority and garner attention. They utilize various narrative mechanisms to encode and recode academic content, balancing comprehensibility with scholarly credibility. Simultaneously, platform algorithms modulate the visibility and interactivity of academic short videos. This study reveals several bottlenecks in the current digital practices of social science journals.
Regarding narrative subjects, there is a notable lack of official verification, which compromises perceived credibility. The narrative process often fails to provide deep interpretation of academic content, and current dissemination strategies are highly homogenized. Concerning narrative effect, user feedback remains sparse and audience stickiness is weak. Enhancing the effectiveness of these digital narrative practices requires a concerted effort. We suggest that social science journals pursue overall improvement across three core elements: authority, comprehensibility, and effectiveness. This necessitates strengthening the authority of narrative subjects, diversifying dissemination strategies, enhancing the innovation and differentiation of narrative mechanisms, encouraging audience participation through a combination of expert-generated content and user-generated content, and developing an agreed framework to measure the conversion of platform traffic into knowledge adoption, including journal subscriptions and citations. In conclusion, this study shows that the interrelationship between subject and audience has the potential to be a significant perspective in analyzing the digital transformation of social science knowledge and ideologies. We suggest that social science journals develop a diversified cross-media portfolio of strategies: print publications to ensure academic depth, WeChat official accounts to increase visibility, and short video platforms to broaden social reach—thereby achieving both greater depth and expanded breadth. This study advances theoretical and empirical understandings of digital publishing and provides actionable guidance for social science journals to cultivate a sustainable and recognizable knowledge brand on short video platforms.
A competency map is a visual graphic used to present the progression and structural relationships of vocational competencies. Leveraging digital technology, it describes the connotative elements of vocational competencies, their related resources, and the carriers of those resources, playing a significant role in mining, analyzing, constructing, mapping, and displaying multilevel vocational competencies and their interrelationships. Currently, competency maps have become a key technology for promoting the digital transformation of vocational education in the era of artificial intelligence and serve as a core instrument in the development of digital textbooks in vocational education. Employing research methods such as literature analysis and logical reasoning, this study focuses on the dynamic mechanisms and practical obstacles in the competency map-driven construction of digital textbooks in vocational education, aiming to propose targeted implementation pathways. The driving forces behind competency map-driven construction of digital textbooks in vocational education are diverse and hierarchical, covering three dimensions: institutional, practical, and technological dynamics. Specifically, in terms of institutional dynamics, a series of policies guide the development of digital textbooks in vocational education; in terms of practical dynamics, digital transformation facilitates the development of digital textbooks in vocational education; in terms of technological dynamics, cutting-edge technologies empower the upgrading of digital textbooks in vocational education. 
However, in practical application, vocational education digital textbooks face challenges such as ambiguous teaching objectives, difficult content integration, significant disparities among teaching modules, and excessive technological intervention. As a result, the competency map-driven process encounters practical obstacles: it can impair the cultivation of students' vocational literacy, hinder the construction of students' knowledge systems, obstruct students' holistic grasp of knowledge and skills, and weaken positive teacher-student interaction and emotional connection. Therefore, four strategic pathways are proposed to implement competency map-driven construction of digital textbooks in vocational education. First, anchor vocational literacy development by setting clear teaching objectives for vocational education digital textbooks. Second, construct a vocational education knowledge system by developing systematic content for vocational education digital textbooks. Third, efficiently integrate learning resources to develop blended teaching modules for vocational education digital textbooks. Fourth, strengthen teacher-student teaching interaction and optimize the technology-driven mechanism of vocational education digital textbooks.
In recent years, generative artificial intelligence (AIGC) has experienced rapid development and widespread application in the publishing sector. As an emerging technological application and a form of new quality productivity, AIGC is reshaping the publishing ecosystem, catalyzing profound transformations in content production and publishing workflows. How to respond to the disruptive changes that generative artificial intelligence brings to publishing is an issue that demands urgent academic attention and analysis. Existing studies have explored the application prospects, potential risks, ethical compliance, copyright protection, and governance and supervision of AIGC in the publishing industry. Based on literature analysis, this paper focuses on the technical logic and implementation paths of generative artificial intelligence empowering the publishing industry from the perspective of marketing theory. The application of generative artificial intelligence in publishing marketing enables precise market predictions and facilitates efficient decision-making. Leveraging robust data processing and self-learning capabilities, it constructs precise user profiles by analyzing multidimensional data, including user behavior and purchasing preferences. Furthermore, generative artificial intelligence autonomously generates personalized textual content aligned with user preferences, thereby elevating reading experiences. Through model training based on user portraits, the algorithm can accurately identify user demands across various scenarios, thereby enabling more precise and personalized publishing product recommendations. 
Regarding specific practical paths, this paper proposes that generative artificial intelligence plays a significant role in topic planning and text processing, personalized promotion and targeted distribution of publishing marketing materials, automated marketing copy generation and creative design of book advertisements, and intelligent marketing customer service and emotional services. First, in topic planning, generative artificial intelligence builds an intelligent publishing topic planning system through multidimensional data fusion analysis, enhancing editorial efficiency. Second, in text content processing and optimization, AIGC's multimodal deep fusion capability can transform text content intuitively into visual images. Third, based on user portraits, generative artificial intelligence can predict user preferences and interests in advance and accurately recommend book products through precise identification and matching. Fourth, on the basis of the generated book content summary, publishing operators can also utilize AIGC to generate book covers, and further explore and refine the marketing selling points of the book, thus enhancing its marketing advantages. Additionally, AIGC-powered intelligent customer service and emotional engagement enhance users' emotional experience and build deep trust between enterprises and users. As an emerging technology, AIGC is a "double-edged sword", and how to avoid its potential risks requires careful consideration. This paper advocates a balanced approach: actively promoting AIGC applications in publishing marketing while mitigating the potential risks of those applications. 
Therefore, adherence to the principles of "ethics first" and "technology for good" is essential, embedding ethical principles such as fairness, justice, and honesty into model training to promote the transformation of publishing marketing toward a new paradigm of "human-machine collaboration".
The development of internationally influential scientific, technical, and medical (STM) journals that align with China's technological advancement depends fundamentally on establishing an editorial team with a "craftsmanship spirit." An analysis of the contemporary connotations of craftsmanship spirit in the context of STM journals reveals its dual essence: a pursuit of values and professional competence. Social cognitive theory, particularly its triadic reciprocal determinism, provides a valuable perspective for understanding the interactions between individual factors (such as self-efficacy and values), environmental factors (institutional frameworks, cultural context, and physical infrastructure), and behavioral outcomes (experience and practice). In the cognitive dimension (individual factors), value pursuit manifests through editors' role as scientific gatekeepers and their professional dedication, while professional competence requires self-efficacy-driven knowledge iteration and skill enhancement, forming a dynamic mechanism in which value orientation and capability development mutually reinforce each other. The behavioral dimension demonstrates value pursuit through standardized operating systems and cultural construction, with professional competence reflected in lean practices such as technological integration and continuous learning, extending value through technical behaviors. In the environmental interaction dimension (environmental factors), value pursuit requires editors to uphold academic missions amid institutional reforms and integrate resources to build academic communities; meanwhile, professional competence necessitates addressing emerging challenges such as open science and intelligent publishing. This involves constructing a "technology-content-service" system, deeply embedding emerging technologies such as artificial intelligence and big data into the entire publishing process to achieve innovative breakthroughs. 
Based on social cognitive theory and combined with the key elements of building world-class scientific journals in China, this paper explores the new requirements that high-quality development places on editors' craftsmanship spirit. These are reflected mainly in three dimensions: cognitive upgrading—redefining the editor's role in alignment with national strategy; behavioral enhancement—acquiring and applying interdisciplinary knowledge; and environmental adaptation—publishing innovation in the context of technological evolution. Editors must closely integrate their work with national strategic priorities, ensuring that journal content both supports national objectives and guides the development of disciplinary research. It is essential for editors to construct a system for learning and applying cross-disciplinary knowledge, relying on an interactive "editing-learning-research" mechanism to enable continuous renewal of their knowledge structures. Taking the publishing practice of Science and Technology Review Publishing House as an example, this paper proposes that the main approaches for STM journal editors to practice craftsmanship spirit for high-quality development include: deepening understanding of national strategies while building a quality assurance system; establishing a dynamic knowledge updating system to enhance academic services; and innovating publishing models to promote the transformation and upgrading of journals. In conclusion, the craftsmanship spirit, characterized by strong value commitment and professional excellence, remains fundamental to the high-quality development of China's STM journals.
This article employs an integrated methodology that combines quantitative and qualitative research, unifies theory and practice, and reconciles facts with values, to conduct an in-depth analysis of the issues pertaining to journal quality. It investigates the phenomena of ethical misconduct within these quality problems and explores the underlying causes for such ethical lapses among publishing stakeholders. Consequently, it offers reflections on pathways for constructing moral norms for these entities. Journal quality encompasses four key dimensions: content, editing and proofreading, publication format, and printing quality. A deficiency in any of these areas can result in substandard journal quality. Underlying these quality issues are often instances of ethical misconduct by publishing stakeholders. Content quality problems reflect scientific integrity breaches and responsibility shirking; editing and proofreading issues indicate weakened political awareness and responsibility dependency; while problems with publication format and printing quality point to profit-driven motives and the neglect of responsibility. Such ethical misconduct can significantly impact journal quality, potentially leading to the publication of articles containing errors in political direction, public opinion guidance, value orientation, or numerous textual inaccuracies. The publishing stakeholders involved in these ethical issues primarily include authors, peer reviewers, editors-in-chief, editors, proofreaders, designers, and printers. Analyzing the reasons behind these ethical failures is crucial for establishing practical moral norms for publishing stakeholders. The primary causes of ethical misconduct among publishing stakeholders include institutional gaps, external environmental pressures, challenges posed by artificial intelligence (AI), and conflicts in moral decision-making. 
Institutional norms that are ambiguous and weakly enforced, the utilitarian orientation and a crisis of trust induced by environmental pressures, and the challenges of emerging AI technologies leading to misuse and blurred ethical boundaries are primarily objective factors. Conversely, subjective factors predominantly involve conflicts in moral decision-making, including the stakeholders' cognitive biases regarding their roles, conflicts in behavioral motivations, and the weakening of moral self-discipline. Establishing comprehensive moral norms for publishing stakeholders and facilitating their internalization into self-regulatory practices are pivotal to enhancing journal quality. The construction of moral norms for publishing stakeholders should, first and foremost, establish a solid ethical foundation by clarifying the core principles and value orientations of these norms. The formulation of such norms must align with the core socialist values, conform to fundamental professional ethics in China, and promote the nation's fine traditional virtues. Second, institutional support must be strengthened through enhanced heteronomous regulations and external constraints. This requires concerted efforts from various actors within the publishing industry, including national and provincial publishing administration departments, the supervising and sponsoring institutions of journals, publishing units, authors' institutions, and industry associations. Third, it is essential to stimulate endogenous motivation by fostering the self-cultivation of moral autonomy and cultural consciousness among publishing stakeholders. This involves a commitment to continuous learning and self-renewal, enhancement of professional dedication and personal integrity, practice of the unity of knowledge and action, and adherence to the principle that literature should convey truth and morality. 
Through these approaches, an organic unity of external regulation and internal self-discipline can be achieved, thereby promoting the sustained improvement of journal quality.
Nations worldwide are actively developing and leveraging data resources both domestically and internationally. These resources exhibit economic characteristics including externalities, non-rivalry, and non-excludability, alongside sociological attributes such as shareability, spatiotemporal relevance, and public accessibility. Data quality serves as a critical determinant of model performance in generative artificial intelligence (GenAI) systems, and the lack of high-quality training datasets remains a significant challenge across sectors. While previous research on data elements has focused on implementation aspects, this study examines the underlying rationale and methodologies. This paper establishes the connotation and extension of high-quality datasets, identifying their quality dimensions within a three-dimensional, six-tier analytical framework: (1) the structural dimension; (2) the spatiotemporal dimension; and (3) the security dimension. The core requirements for constructing high-quality datasets are categorized into four perspectives: (1) the data unit level; (2) the dataset level; (3) social benefit; and (4) economic benefit. This framework integrates technical specifications with governance principles, addressing both operational efficiency and societal value creation. The analysis examines industry-specific characteristics and resource endowments to demonstrate why the publishing sector holds unique social responsibility in constructing high-quality datasets. Publishing data exhibits inherent advantages: (1) quantity: rich diversity of types and abundant reserves; (2) quality: rigorous supply mechanisms and strict review processes; (3) externality: traceable ownership and privacy clearance; and (4) standardization: technical support and cross-referencing capabilities. At the data unit level, publishing data undergoes comprehensive peer review and expert verification, ensuring superior accuracy and reliability compared to alternative data sources. 
Publishing data achieves substantial completeness and richness through comprehensive industry coverage. At the dataset level, professional editorial teams facilitate secondary knowledge production during data aggregation. They integrate technology with publishing workflows in processes such as packaging, delivery, error correction, and iterative updates, establishing sustainable version control mechanisms. Regarding benefits, publishing data inherently features desensitization and alignment with mainstream ideological values, addressing the balance between data protection and public accessibility. Moreover, the publishing industry's established ownership tracing and benefit distribution mechanisms provide a foundation for business evolution, facilitating trust networks and incentive-compatible business models between data providers and users. From a meso-theoretical perspective, this study employs a best-practice approach, examining mature image databases in the digital copyright trading industry as case studies. It analyzes principles and methodologies for constructing high-quality datasets, proposes operational and training recommendations, and achieves alignment between theory and practice. The marginal contributions of this paper are threefold: first, clarifying the scope and definition of high-quality datasets; second, analyzing the publishing industry's characteristics and advantages to identify key stakeholders; and third, recommending standards, operational principles, and construction methods for high-quality datasets.
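The unit-level and dataset-level requirements discussed above lend themselves to automated screening. The following Python sketch is a hypothetical illustration only: the field names (`title`, `doi`, `reviewed`), the DOI pattern, and the specific checks are our own assumptions, not part of the framework described in the study.

```python
import re

def check_record(record: dict) -> list[str]:
    """Unit-level checks: completeness and accuracy of one publishing data record.
    Field names are illustrative placeholders, not taken from the paper."""
    issues = []
    for field in ("title", "doi", "reviewed"):
        if field not in record or record[field] in (None, ""):
            issues.append(f"missing {field}")
    doi = record.get("doi", "")
    # A loose sanity check against the common "10.prefix/suffix" DOI shape.
    if doi and not re.match(r"^10\.\d{4,9}/\S+$", doi):
        issues.append("malformed DOI")
    return issues

def check_dataset(records: list[dict]) -> dict:
    """Dataset-level checks: coverage and duplication across the collection."""
    dois = [r.get("doi") for r in records if r.get("doi")]
    return {
        "records": len(records),
        "duplicate_dois": len(dois) - len(set(dois)),
        "unit_issues": sum(len(check_record(r)) for r in records),
    }

good = {"title": "T", "doi": "10.1234/abc", "reviewed": True}
bad = {"title": "", "doi": "not-a-doi", "reviewed": True}
print(check_record(bad))
print(check_dataset([good, good, bad]))
```

In this sketch, unit-level results roll up into dataset-level statistics, mirroring the two tiers the study distinguishes; a real pipeline would add the version-control and ownership-tracing mechanisms described in the text.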
The deep integration of generative artificial intelligence (AI) with publishing is driving a paradigm shift in the industry, fundamentally transforming its underlying logic and overarching forms. Against this backdrop, this paper introduces and systematically elaborates on the novel concept of "generative publishing". Generative publishing is defined as an emerging publishing paradigm that utilizes large language models as its technical foundation, is led by publishing institutions, adopts human-machine collaboration as its core content generation logic, and aims to produce generative publications and services tailored to users' personalized needs. Its conceptual essence can be deconstructed across five dimensions: it relies on large language models as the technical base for autonomous content production; it redefines the role of publishing institutions from intermediaries to leaders in content creation, engaging directly with users; it prioritizes human agency within human-machine collaboration to ensure content quality and value alignment; its outputs manifest in two primary forms—generative publications and generative publishing services; and collectively, it signifies a distinct developmental phase beyond traditional publishing formats. On the basis of the dynamics of human-machine collaboration, the extension of generative publishing can be categorized into three types: (1) Controlled generative publishing: Human professionals retain full control, provide precise instructions and perform rigorous quality checks. The large language model functions as an advanced tool within a tightly defined human-led framework. (2) Interactive generative publishing: Characterized by a bidirectional feedback loop, this model involves iterative co-creation. The large language model generates content and performs initial assessments, while human experts provide selective calibration and creative input. 
(3) Autonomous Generative Publishing: The large language model operates with substantial independence within predefined objectives and ethical boundaries. Human oversight shifts from direct process control to ex-post review, exercising veto authority while relying on embedded governance rules. To operationalize this paradigm, this paper constructs an actionable framework comprising three core components: (1) The publishing large language model, serving as the technical cornerstone, developed through strategic planning, data preparation, training and fine-tuning, and deployment and maintenance. (2) Generative publications as the core products, including compiled types (intelligent reorganization and systematization of existing knowledge), derivative types (creative transformation of classical content), and original types (dual innovation in both conceptual and formal dimensions). (3) Generative publishing services, embodying the knowledge-service attribute, with representative forms such as knowledge service robots and intelligent reading interaction interfaces. Generative publishing represents the industry's evolution toward a dynamic, intelligent, and knowledge-service-oriented ecosystem. While promising, its advancement inevitably confronts challenges related to copyright, authenticity, professional restructuring, and ethical governance, necessitating a balanced approach that fosters innovation while upholding the core values of publishing.
The international scholarly communication ecosystem is undergoing a profound transformation. This study examines the strategic responses of five leading academic publishers—Elsevier, Springer Nature, Wiley, Taylor & Francis, and SAGE—from 2022 to 2025. By analyzing corporate annual reports, official press releases, policy documents, and industry whitepapers, this research identifies three interconnected trajectories reshaping the global publishing ecology. First, strategic transformation and business restructuring have accelerated. While overall revenues for most publishing groups remain resilient or show growth, the compositions of these revenues reflect a decisive pivot away from traditional subscription-based publishing. Traditional subscription-based publishing is approaching saturation, evidenced by stable or slightly declining shares of total revenue. Growth is now driven by diversification strategies: aggressive acquisitions target adjacent services such as research workflow tools, data analytics platforms, and artificial intelligence (AI)-driven solutions; divestitures of non-core assets sharpen focus on scholarly research; substantial research and development (R&D) investment fuels AI integration across discovery, peer review, writing assistance, and plagiarism/image detection; and strategic expansion into emerging markets (Asia, Latin America, and the Middle East) is vigorously pursued through transformative agreements and localized partnerships. Second, the transition toward Open Access (OA) is steadily accelerating, albeit accompanied by ever-changing models and emerging challenges. There has been a notable surge in the signing of Transformative Agreements, peaking around 2023. These agreements have emerged as the favored approach for facilitating large-scale OA conversion, typically structured as multi-year, consortium-based "Read & Publish" deals, such as the DEAL agreements in Germany. 
Publishers also experiment with alternative OA pathways like Subscribe to Open, Pledge to Open, Direct to Open, Flip it Open, and Purchase to Open, as well as collective funding models and diamond OA initiatives. Efforts are made to include smaller institutions in OA frameworks. However, sustainability concerns persist. Article Processing Charge (APC) inflation continues, with differential pricing based on Creative Commons licenses or speed of service emerging. Third, engagement in scholarly ecosystem governance has intensified. Recognizing their pivotal role, publishers are investing heavily in research integrity infrastructure: developing and deploying AI tools to detect paper mills, image manipulation, and AI-generated text; participating in global initiatives like United2Act; and collaborating on authorship guidelines. Service optimization for authors and librarians includes AI-powered peer review matching, manuscript transfer services, reduced decision times, writing assistants, code sharing integration, and training resources for AI literacy. Drawing on these international experiences, this paper proposes strategic recommendations for China's academic publishing industry. It is suggested that China should leverage its institutional advantages to build autonomous and controllable intelligent publishing platforms through technological empowerment. Regarding the OA pathway, prioritizing non-APC/BPC models, such as diamond OA and collaborative funding mechanisms, is crucial for developing a sustainable and inclusive system. Furthermore, in terms of governance mechanisms, proactive participation in international rule-making is essential to enhance China's discourse power in global academic communication. The ultimate goal is to construct a new academic communication ecosystem that is open, sustainable, and globally influential.
Scientific, technical and medical (STM) journals serve as vital platforms for disseminating scientific knowledge and advancing technological innovation. However, due to various negative factors, the publishing environment for Chinese STM journals has deteriorated, significantly restricting their sustainable development. It is evident that STM journals do not exist in isolation; rather, the publishing environment plays a crucial role in their growth and survival. Ecology, as a discipline, examines the interrelationships among organisms and between organisms and their environment. Publishing ecology extends these ecological theories to the study of publishing systems, serving as an interdisciplinary framework that investigates the laws and mechanisms governing the interactions between publishing media and the environmental factors influencing their survival and development. From the perspective of publishing ecology, this study first elaborates on the interdependent and interactive relationship between STM journals and their publishing environment. It then examines the current challenges that STM journals face, including the distortion of the academic evaluation system, inadequate enforcement of the peer review system, decentralized journal management structures, and deficiencies in the journal evaluation system. Finally, four ecological construction strategies are proposed to restore balance and promote sustainability: (1) Strengthen enforcement of the peer review system and rigorously verify issues related to research integrity and publication ethics. Adhere strictly to the peer review system, ensuring fairness, transparency, and accountability. 
Reform the academic evaluation system to refocus on the original intention of scholarly inquiry and cultivate a healthy academic ecosystem; (2) Encourage constructive competition among journals: adopt ecological niche strategies, clearly define their positioning, highlight the characteristics and branding of the publications, and develop unique competitive advantages. Simultaneously, foster collaboration among journals, follow the path of group development, and build a favorable industry ecosystem; (3) The national publishing administrative authorities should leverage their role in macro-regulation by improving the journal management system and strengthening oversight of journal publishing activities. Establish a scientifically sound and balanced journal evaluation system that comprehensively evaluates journal quality, thereby cultivating a supportive institutional ecosystem; (4) Utilize new media platforms—such as WeChat official accounts and video accounts—to expand dissemination channels, enrich content formats, and improve accessibility for authors and readers. Focus on cutting-edge developments within disciplinary fields, identify emerging research hotspots, promptly publish breakthrough academic achievements, report the latest development trends of various disciplines, and enhance the journal's role in disciplinary development. Centering on the major demands and technical challenges of national economic development, journals should carry out guided thematic planning to provide a knowledge foundation for breaking through "bottleneck" technologies, enhancing their capacity to serve societal needs and building a robust service ecosystem. Collectively, these strategies aim to foster harmonious development of STM journals and their publishing environment, offering theoretical insights and practical guidance for promoting the sustainable development of Chinese STM journals and enhancing their global academic influence.
The digital-intelligent transformation of educational publishing constitutes a systemic reconstruction across legal, ethical, organizational, and technological dimensions. This study addresses the core dilemma confronting K–12 educational publishing institutions: leveraging data-driven approaches to deliver precise and personalized educational services while rigorously safeguarding the privacy of minors—a legally protected category of "sensitive personal information." Employing a "Problem-Attribution-Solution" analytical framework, the research conducts a systematic investigation. By integrating textual analysis of key legal provisions (such as China's Personal Information Protection Law and Minors Protection Law), scrutiny of industry practices, and feasibility assessments of emerging privacy-enhancing technologies and service models, this paper moves beyond superficial descriptions to provide an in-depth, multi-faceted analysis. The study first delineates the concrete manifestations of this dilemma, revealing inherent conflicts between the scope of data collection and the minimum necessity principle, between deep data utilization and the purpose limitation principle, between rising technical compliance costs and unsustainable profit models, and between agile business development and rigorous compliance procedures. It then diagnoses three critical capability gaps within publishing institutions that underlie these conflicts: (1) an "easier said than done" gap in legal cognition, wherein abstract principles fail to translate into operational rules; (2) a significant technical capability deficit, where legacy systems and the absence of robust data governance frameworks impede secure data handling and the adoption of Privacy-Enhancing Technologies (PETs); and (3) a management system void, characterized by the absence of dedicated roles, comprehensive internal policies, and inadequate oversight of third-party partners. 
As the core outcome, this paper constructs and elaborates a novel, integrated "Management-Technology-Content" trinity governance framework as a strategic pathway forward—one that embeds the "Privacy by Design (PbD)" principle, a core tenet emphasized in the research, into the entire digital-intelligent transformation process. This framework advocates for proactive, systemic change: Management-wise, it proposes establishing a privacy-first governance architecture with dedicated leadership (e.g., a Data Protection Officer), clear internal ethical guidelines, full-lifecycle data policies, and user-friendly transparency mechanisms. Technology-wise, it evaluates the deployment and application of key PETs—including data masking, strict anonymization, federated learning, and differential privacy—analyzing their operational mechanics, suitability for educational scenarios, and inherent trade-offs. Content-wise, it pioneers a paradigm shift toward service models with low dependency on sensitive personal data, such as modular "knowledge components" libraries for user-led navigation, explicit interactive feedback replacing implicit behavioral monitoring, context-aware marketing based on group trends, and exploration of "data trusteeship" collaborations with authorized third-party platforms. This research concludes that sustainable transformation requires a fundamental rethinking of data value—shifting from exploiting data for predictive control to utilizing it to support learner autonomy and educational equity. The proposed holistic framework aims to provide a theoretically grounded and practically actionable guide for educational publishers to navigate the dual imperatives of innovation and compliance, thereby fostering a responsible, trustworthy, and ethically sound ecosystem for digital-intelligent development in education. The findings underscore that the future lies not in deeper data mining but in building respectful, low-dependency, and trustworthy service paradigms.
The pathways and implementation strategies for scientific, technical, and medical (STM) journals to effectively facilitate the transformation of scientific research achievements are investigated, with a focus on resolving the disconnect between demand and service in the innovation chain and bridging the gap between "paper-based achievements" (research papers) and "production lines" (industrial application). Through literature review, data analysis, and in-depth case studies, this study elucidates the functional role of STM journals in achievement transformation and proposes actionable tactics to enhance their service capabilities. This paper identifies three core functions of STM journals in driving transformation. First, as a foundational platform for knowledge dissemination and scientific validation, they authenticate the scientific rigor of research outputs through rigorous peer review, laying a credible groundwork for industrial adoption. Second, as a value-identification tool, their interdisciplinary editorial and review teams, comprising clinical specialists, industry consultants, and researchers, uncover the application potential of achievements, emphasizing economic and societal impact. Third, as a connectivity hub for industry-academia-research collaboration, they organize academic forums, showcase transformation cases, and enable the sequential transformation of "basic research—application development—industrial implementation". Nevertheless, three critical bottlenecks are pinpointed. First, fragmented academic evaluation systems prioritize citation metrics and journal impact factors over transformation potential, resulting in a national scientific research patent conversion rate of merely 6% and a university invention patent industrialization rate of 3.9%.
Second, journals lack robust information screening and matchmaking mechanisms: most fail to integrate "clinical—research—industry" needs or specify application scenarios, causing an average 5-year lag between academic publication and industrialization. Third, traditional journals operate solely as "paper publishers" rather than "knowledge hubs," failing to foster a collaborative ecosystem that balances the "academic prestige focus" of research institutions and the "commercial viability focus" of enterprises. To address these challenges, five targeted solutions are proposed: refine content positioning by incorporating "technology transformation columns" (e.g., Chinese Medical Journal tracking COVID-19 miRNA diagnostic markers) and establishing a "clinical-research-industry" topic linkage mechanism, exemplified by Cancer Pathogenesis and Therapy (CPT) curating interdisciplinary themes; strengthen service capacities via a "basic research + application value" dual-track review framework (adopted by The New England Journal of Medicine) and interdisciplinary editorial boards with clinical, industry, and regulatory experts; build collaborative platforms featuring online transformation zones (e.g., CPT planning an "anti-cancer technology transfer" section) and offline technical matchmaking symposia; advance internationalization through joint special issues with global journals and engagement in worldwide technology networks; and enhance evaluation protocols by monitoring transformation outcomes (e.g., patent acquisition, regulatory approval). In conclusion, STM journals must evolve from "paper disseminators" to "knowledge hubs", effectively connecting laboratories and production lines.
This evolution accelerates the transformation of scientific research achievements while contributing to the cultivation of new quality productive forces, aligning with national strategies to integrate innovation and industrial progression and operationalizing the principle that "scientists should write the paper on the land of our country" and apply technological achievements to modernization.
In recent years, large models for artificial intelligence-generated content (AIGC) have been extensively utilized in topic selection, editorial processes, and academic dissemination, facilitating the development of an intelligent publishing model for scientific journals and significantly improving the efficiency of academic journal editing, publishing, and multimedia communication. However, the application of AIGC remains characterized by ambiguities, with frequent occurrences of academic misconduct and ethical risks stemming from inappropriate use. This paper investigates the AIGC-related publishing ethics policies of 160 Chinese medical journals included in the World Journal Clout Index (WJCI) of Scientific and Technological Periodicals (2023 edition). It analyzes the AIGC application policies pertaining to authors, editors/peer reviewers, and journals/publishers in Chinese medical journals and, by integrating AIGC application policies and guidelines from domestic and international publishers and institutions, proposes an ethical framework and policy recommendations for AIGC application in medical scientific research, manuscript writing, academic publishing, and dissemination. The findings reveal that 61 medical journals (38.13%) have explicitly addressed AIGC application policies in their publication ethics documents, with most of these policies (67.21%) focusing on the processes of paper writing and review; 99 journals (61.87%) lack any AIGC application policy, highlighting a significant gap in the prevalence and enforcement of AIGC ethics standards and management measures in Chinese medical journals.
While Chinese medical journals with declared AIGC application policies have reached a basic consensus on the boundaries of AIGC application for authors, variations persist in the AIGC application policies concerning editors, peer reviewers, and journal publishers, necessitating further refinement. Consequently, this paper synthesizes and delineates the application scenarios and application frameworks of AIGC technology across the stages of medical scientific research, manuscript writing, academic publishing, and dissemination, aligned with the realities of Chinese medical journals. In response to the non-uniformity of disclosure and declaration policies for AIGC applications, Chinese medical journals should clearly define the application scenarios and boundaries of AIGC for different stakeholders, regularly organize AIGC training and exchange discussions, and refine policies concerning AIGC application and disclosure. Establishing a review system for AIGC application standards, dynamic oversight, and a tiered punishment mechanism for improper application will foster standardized AIGC implementation. Future efforts should explore more nuanced application scenarios and boundaries of AIGC, tailored to the specific characteristics of various medical disciplines and the demands of regulatory entities, to facilitate the standardized integration of AIGC within the realms of medical research, academic publishing, and dissemination.
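The proportions reported above follow directly from the journal counts; as a minimal sketch using only the counts stated in the abstract, the arithmetic can be checked as follows:

```python
# Counts reported for the survey of WJCI-indexed Chinese medical journals (2023 edition).
total_journals = 160
with_policy = 61       # journals with explicit AIGC application policies
without_policy = 99    # journals lacking any AIGC application policy

# The two groups partition the sample.
assert with_policy + without_policy == total_journals

share_with = with_policy / total_journals * 100      # exactly 38.125
share_without = without_policy / total_journals * 100  # exactly 61.875

print(share_with, share_without)
```

The exact shares are 38.125% and 61.875%, consistent with the rounded figures of 38.13% and 61.87% given in the abstract.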
Based on the practical progress and internal logic of artificial intelligence-generated content (AIGC) empowering digital publishing, this study constructs a digital publishing industry chain resilience evaluation model encompassing five key elements: structural links, resilience attributes, resilience sources, institutional guarantees, and resilience levels. The model aims to provide theoretical support and analytical tools for the scientific identification, precise assessment, and systematic enhancement of industry chain resilience in the context of deep AIGC integration and accelerated industry restructuring. It emphasizes the dynamic collaborative evolution among these elements, addressing limitations of current resilience research, which often relies excessively on physical boundaries, focuses on a single capital-driven approach, and neglects institutional factors. As such, the model provides a more integrated explanatory perspective for current industry chain resilience research. AIGC is profoundly reshaping the links in the digital publishing chain by overcoming temporal and multi-party constraints, enabling individual entities to complete the publishing process anytime and anywhere. This promotes efficient coordination and integration across different links, thereby enhancing the resilience of the industry chain. Yet AIGC also introduces cross-link risks, such as copyright disputes, information silos, and user data breaches, that, if accumulated and propagated, may disrupt the operational rhythm of the chain and weaken its resilience. Therefore, a dedicated evaluation model is urgently needed to assess AIGC's dual impact on industry chain resilience.
Existing resilience models are primarily based on either the ecological dynamics-inspired four attributes (4R: robustness, redundancy, adaptability, and rapidity) framework or the sustainable livelihoods-oriented five capitals (5C: human, social, natural, physical, and financial capitals) system, both of which are better suited to traditional industrial chains with clear physical boundaries. Even studies that attempt to integrate 4R and 5C commonly overlook the critical role of legal and policy-related institutional factors. In contrast, digital publishing centers on virtual information space while relying on physical infrastructure for support, resulting in highly blurred boundaries between dynamic and static elements. Moreover, as a key vehicle for ideological dissemination, it carries distinct political attributes and cultural security responsibilities. Consequently, legal and policy-based institutional safeguards are not merely external environmental variables but have become structural elements embedded within the operation of the digital publishing industry chain as well as endogenous variables directly shaping its resilience. To address these gaps, this study proposes a "3L+4R+5C+G+3Le" resilience evaluation model for the digital publishing industry chain. It defines three links (3L): content production, channel distribution, and user consumption, and analyzes the influence of the 5C on resilience. Legal, policy, and other institutional safeguards (G) serve as an endogenous foundation spanning the entire chain. Resilience performance is characterized through 4R. Finally, through weighting and comprehensive measurement, resilience is classified into three levels (3Le: high, medium, and low). This model represents a conceptual shift from "structural analysis" to "state identification," transforming industrial chain resilience from an abstract notion into an assessable and manageable strategic capability.
Future research should focus on developing resilience indicators and exploring pathways to enhance resilience in the digital publishing industry chain.
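The "weighting and comprehensive measurement" step of such a model can be illustrated with a minimal sketch: score the 4R attributes, combine them with assumed weights, and map the composite onto the three levels (3Le). All weights, indicator values, and thresholds below are invented for illustration and are not taken from the study.

```python
# Hedged sketch of composite resilience scoring and level classification.
# Weights, scores, and level thresholds are illustrative assumptions only.

RESILIENCE_ATTRS = ["robustness", "redundancy", "adaptability", "rapidity"]  # 4R

def composite_score(scores: dict, weights: dict) -> float:
    """Weighted sum of normalized indicator scores (each assumed in [0, 1])."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(scores[attr] * weights[attr] for attr in RESILIENCE_ATTRS)

def resilience_level(score: float, high: float = 0.7, medium: float = 0.4) -> str:
    """Map a composite score onto the three levels (3Le: high, medium, low)."""
    if score >= high:
        return "high"
    if score >= medium:
        return "medium"
    return "low"

# Hypothetical assessment of one chain segment (e.g., content production).
weights = {"robustness": 0.3, "redundancy": 0.2, "adaptability": 0.3, "rapidity": 0.2}
scores = {"robustness": 0.8, "redundancy": 0.5, "adaptability": 0.6, "rapidity": 0.7}

total = composite_score(scores, weights)   # 0.24 + 0.10 + 0.18 + 0.14 = 0.66
print(total, resilience_level(total))       # composite ~0.66, level "medium"
```

In practice, the weights would come from the study's weighting procedure and the 5C and G elements would feed additional indicators; the point of the sketch is only the shift from "structural analysis" to an assessable "state identification" output.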
Popular science journals can conveniently and effectively communicate complex scientific knowledge with the help of short videos, which offer the unique advantages of enhancing the efficiency of information conveyance and improving readers' understanding. This study examines short videos of popular science journals by adopting the case study method to analyze the visual rhetoric of selected short videos from leading popular science journals. The findings indicate that, in content presentation, elaboration and case presentation are the dominant paradigms in short videos of popular science journals, while there is an obvious gap in data visualization. The principles of scientific knowledge, often abstract and complex, require clear visual representation through graphs, charts, and other means; without such visualization, it is difficult for ordinary viewers to grasp core ideas or discover the laws hidden behind the data. Regarding narrative design, although the technical advantages of dynamic graphics and special effects are fully exploited, the immersive design of interface architecture and interactive elements fails to establish a comprehensive narrative system. The lack of interactive design can easily block the bidirectional communication mechanism of science communication: the absence of interactive functions in short videos, such as click-triggered dynamic buttons and adjustable experiments, diminishes the audience's willingness to actively explore the principles of science. Regarding sensory guidance, metaphorical and symbolic signs build a three-dimensional cognitive network, while the lack of emotionally guiding symbols leads to a significant cognitive empathy gap. The presentation of short videos was originally intended to bring scientific knowledge closer to the audience.
However, if the audience lacks emotional resonance in the process, the mechanical mode of knowledge instillation can neither arouse the audience's curiosity for scientific exploration nor establish a meaningful connection between scientific topics and individual life, which ultimately reduces scientific communication to a one-way delivery of information rather than a meaningful transmission of ideas and values. Based on these findings, this paper first recommends strengthening content presentation and logical structure while enhancing visual appeal through the visual translation of published knowledge. To address insufficient data visualization of scientific and technological knowledge in content presentation, dynamic charts (such as line graphs, heat maps, and 3D models) can replace static charts, disassembling complex data into multi-layer infographics and lowering the threshold of comprehension through step-by-step animation displays from the macroscopic to the microscopic and from the holistic to the local, so as to enhance the visual attractiveness of the short videos of popular science journals. Second, the emotional resonance of narrative design should be strategically utilized to strengthen the empathetic connection between the content and the audience. Using engaging storytelling, scientific knowledge can be integrated into emotionally colored plots, so that the audience naturally accepts scientific information while becoming immersed in the narrative experience. Third, sensory guidance and digital narrative should be optimized, and content refined, to stimulate communication potential. Using vivid images, warm or inspiring color schemes, and expressive character gestures and actions, abstract scientific concepts can be made more relatable and easier to perceive.
The rapid advancement of information technology has catalyzed transformative shifts in cultural consumption patterns while generating increased demands for innovative publishing products. As digital-intellectual technologies permeate socioeconomic spheres, revolutionary transformations are reshaping knowledge dissemination channels and consumption modalities, with cross-industry integration particularly driving the publishing industry's transition from traditional content production toward integrated ecosystem models. In this context, conventional publishing enterprises are actively embracing emerging technologies, fostering novel business models, and expanding into new domains to accelerate the development of integrated publishing and adapt to technological disruptions. Nevertheless, substantial uncertainties remain in business model reconstruction during this integration process, with key challenges including ambiguous market positioning, undifferentiated value propositions, superficial knowledge service implementations, slow channel structure optimization, continuing decline in traditional revenue streams, and limited growth in emerging digital ventures, collectively hindering deep-level integrated advancement. Through case studies, with knowledge services as the strategic core and business model reconstruction as the operational foundation, this paper conducts a detailed analysis of the foundations, conditions, channels, and goals of knowledge services in the context of integrated publishing, thereby proposing innovative business model strategies built on user-centric thinking, digital intelligence, multi-dimensional networks, and value realization. Based on practical experience in integration development, this paper investigates the challenges and root causes of integrated publishing from a supply-side perspective. It examines the value propositions and business models of integrated publishing through the underlying logic of knowledge services.
This paper explores pathways for knowledge service implementation via publishing extensions, establishes a closed-loop value system integrating products, users, technology, and networks, and constructs a business model enabling enterprise value monetization. The analysis examines critical dimensions of integration-era business models: foundational infrastructures, enabling preconditions, distribution channels, and value creation objectives, culminating in a business model innovation framework structured upon user-centric design, digital-intellectual technology integration, multi-dimensional network architecture, and value realization mechanisms. Building upon empirical integration practices and closed-loop value chains integrating products, users, technologies, and networks, the proposed model enables corporate value realization by proposing that firms leverage intellectual property operations to evolve into cultural ecosystem hubs and knowledge service platforms. By targeting niche sectors, companies can build vertically integrated channel systems for multi-tiered, multi-format value monetization. They may also pursue cross-industry horizontal integration to transcend a singular publishing perspective, ensuring coordinated development across products, projects, and industries. Furthermore, firms can enhance user operation value through platforms where traffic, knowledge, and users co-create, achieving the "critical leap" from service to payment. These initiatives collectively construct an advanced, sustainable integrated publishing business model, boosting profitability, strengthening core competitiveness, and fulfilling high-quality development objectives.
With the rapid penetration of generative artificial intelligence (AI) into the publishing industry, publishing marketing, as a key link between production and consumption, is facing a pivotal shift from experience-driven to mechanism-driven operation. Starting from the highly practical publishing proposition of best-selling experience, this study adopted a questionnaire survey (questionnaires were distributed online through publishing industry communities and related WeChat official accounts from late July to August 6, 2025, yielding 1,049 valid responses), in-depth expert interviews (respondents covered state-owned publishing groups, professional publishing houses, private publishing enterprises, and technical service providers and platforms, and all had long worked in their respective fields with first-line practical and management experience), case comparison, and literature analysis to examine in depth the application status of AI-generated content (AIGC) technology in China's publishing industry and to clarify the specific links and application scenarios of AIGC-enabled publishing marketing. It is found that AI technology not only releases productivity at the efficiency end but also reshapes value generation at the levels of narrative organization, audience matching, immersion translation, and organizational collaboration.
In this study, best-selling experience is divided into three levels of knowledge practice: the first is individual experience with editing at its core, covering subjective cognitive resources such as topic selection judgment, text value judgment, and boundary control; the second is organizational experience at the organization and brand level, reflected in the reusable templates accumulated around publishing categories, brand style, and audience stickiness; and the third is industry experience embedded in the logic of the industry system, which emphasizes that published products carry the dual values of cultural and commodity attributes, that public expectations of them should differ from those of ordinary consumer products, and that they nonetheless cannot be separated from the platform economy model and internet development logic constructed by data algorithms. This study emphasizes that the triple paths of individual, brand, and industry experience are interlinked, forming a continuum from individual skills and organizational capability to the institutional environment, which gradually transforms occasional hit-making experience into replicable long-term capability. Amid technological change, best-selling is no longer a naturally occurring result but a generative process composed of strategic design, technical capability, and institutional arrangement. Experience is no longer an ineffable intuitive judgment but a professional system that can be verified, transformed, and passed on. Finally, replicating experience means replicating the working methods and verification devices for facing uncertainty, so that experience can continuously generate new explanatory and practical power while respecting category differences, channel heterogeneity, and timing constraints.
This study not only responds to the transformation needs of the publishing industry against the background of AIGC but also aims to clarify the redefinition of content logic, value negotiation and system construction after AI technology is involved in publishing marketing and provides theoretical support and practical reference for building a new data-driven publishing paradigm.
Against the backdrop of deep integration between digital technology and the cultural industry, the publishing sector is undergoing a fundamental transformation from "paper and ink" to "digitalization and smart technologies." Integrated publishing has emerged as the primary pathway for high-quality development in the industry, with talent cultivation being pivotal to driving this transformation. The publishing industry currently faces multiple challenges in talent development, including rapidly evolving competency demands, outdated training systems, and intense competition for skilled professionals. This paper adopts a case-based research approach, examining Beijing Waiyan Online Digital Technology Co., Ltd. (UNIPUS), a wholly-owned subsidiary of the Foreign Language Teaching and Research Press (FLTRP), to investigate how talent strategies can effectively support digital transformation in publishing. Based on UNIPUS's decade-long practical experience, this paper systematically analyzes the competency requirements and training challenges for integrated publishing professionals. The research identifies and elaborates on the "Five-LI" talent model, comprising "YuanLi" (willpower and intrinsic motivation), "NengLi" (capabilities and multidisciplinary expertise), "YueLi" (practical experience and industry insight), "QianLi" (potential for growth and adaptation), and "XinLi" (resilience and mental strength). This model provides a comprehensive framework defining the core attributes necessary for success in the evolving publishing landscape. Based on this model, the study further establishes six core systems and details their practical application. The Position Coordinate System clarifies job values and development pathways through precise positioning, grading, and channel design, ensuring alignment between individual roles and organizational strategy.
The All-round Talent Map visualizes talent capabilities and identifies gaps through OKR-based performance management and systematic talent review, enabling data-driven decision-making. The Training and Growth System encompasses the entire employee lifecycle, from structured onboarding programs including “one-on-one” mentorship to leadership development initiatives such as the “M0 Management Reserve” program, fostering continuous learning and capability building. The Quantitative Evaluation Mechanism integrates performance outcomes, capability assessments, and values-based behavior metrics, providing a comprehensive and objective basis for talent appraisal. The Incentive and Feedback System connects material rewards, spiritual recognition, and career development opportunities, establishing a motivating environment that retains and engages top talent. The Cultural Nurturing Program converts abstract corporate values into tangible behaviors through employee co-creation and high-frequency cultural immersion, fostering a supportive and adaptive organizational culture. These systems collectively form a robust, dynamic, and culturally-grounded talent development framework that is strategically aligned, role-based, data-supported, and scalable. This paper highlights how UNIPUS’s practices have addressed internal talent development challenges while contributing to building new productive forces capable of driving innovation and growth. This paper provides valuable insights and actionable recommendations for the publishing industry, particularly for organizations navigating digital transformation and cultivating a new generation of integrated publishing professionals. The findings indicate that a systematic and holistic approach to talent management—combining clear positioning, continuous assessment, targeted development, and cultural embedding—can effectively overcome existing barriers and support sustainable industry development. 
The case of UNIPUS provides a replicable model for other enterprises seeking to enhance their talent systems and achieve high-quality growth in the era of integrated publishing.