identifies the core challenges of current risk
management and outlines future research directions;
finally, the conclusions summarize the research
findings and emphasize the strategic value of
dynamic closed-loop systems in uncertain
environments.
2 RISK ASSESSMENT SYSTEM: FROM EXPERT EXPERIENCE TO DATA-DRIVEN METHODS
The core task of risk assessment is to answer two
questions: "Which risks need attention?" and "How
severe are the risks?" This paper proposes an assessment framework of "double-layer filtering, multi-dimensional modeling, and dynamic calibration", combining expert experience with algorithmic intelligence to characterize risks more comprehensively.
2.1 Qualitative Assessment
In areas with insufficient data or emerging risks,
qualitative assessment remains crucial. The Delphi
method is a classic expert consensus method that
reduces group bias and improves prediction accuracy
through anonymous feedback and iterative
adjustments (Dalkey & Helmer, 1963). This method
originated from a 1960s technology forecasting
experiment by the RAND Corporation, where Dalkey
and Helmer (1963) first used it to simulate Soviet
strategic bombing assessments of U.S. industrial
targets. In that experiment, the experts' initial estimates of the number of bombs ranged from 50 to 5,000; after five rounds of anonymous feedback the range converged to 167-360, with the maximum-to-minimum ratio falling from 100:1 to 2:1, highlighting the role of structured feedback mechanisms in curbing groupthink (Dalkey & Helmer, 1963).
Modern enterprises often combine the Delphi method with fuzzy mathematics. For example, XA City Tobacco Company adopted a four-layer risk assessment model (strategic, industry, operational, compliance) and conducted three rounds of opinion solicitation with 15 cross-disciplinary experts, including policy researchers, risk analysts, and corporate compliance officers. In each round, experts scored risk factors on a 1-10 scale, and a fuzzy analytic hierarchy process (FAHP) was used to calculate weights; the "tobacco control policy risk" in the strategic layer received a weight of 0.32, significantly higher than the other factors. On this basis the company established a three-level early warning mechanism (high risk >0.6, medium risk 0.3-0.6, low risk <0.3) (Cox, 2008). This method improved the enterprise's resource allocation efficiency in high-impact risk areas, such as regulatory policy changes, by 40%, but it requires safeguards, such as requiring experts to have at least five years of industry experience and keeping the feedback process transparent, to reduce subjectivity (Dalkey & Helmer, 1963).
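To make the scoring and warning logic concrete, the following is a minimal Python sketch of the final aggregation step. It assumes the FAHP weights have already been derived from expert pairwise comparisons; only the 0.32 weight for tobacco control policy risk and the warning thresholds come from the case above, while the remaining factor names, weights, and scores are purely illustrative.

```python
# Illustrative sketch: aggregate Delphi expert scores with FAHP-derived weights
# and map the result onto the three-level early warning mechanism.
# Weights would come from a fuzzy AHP pairwise-comparison step not shown here.

# FAHP-derived weights for strategic-layer factors (illustrative values;
# "tobacco control policy risk" carries the dominant weight of 0.32)
weights = {
    "tobacco_control_policy": 0.32,
    "excise_tax_adjustment": 0.24,
    "channel_restructuring": 0.23,
    "brand_concentration": 0.21,
}

# Averaged expert scores on the 1-10 scale from the final Delphi round (illustrative)
expert_scores = {
    "tobacco_control_policy": 8.5,
    "excise_tax_adjustment": 6.0,
    "channel_restructuring": 4.5,
    "brand_concentration": 3.0,
}

def risk_index(weights, scores):
    """Weighted average of the scores, rescaled from the 1-10 scale to 0-1."""
    weighted = sum(weights[k] * scores[k] for k in weights)
    return (weighted - 1) / 9  # map 1..10 onto 0..1

def warning_level(index):
    """Three-level early warning: >0.6 high, 0.3-0.6 medium, <0.3 low."""
    if index > 0.6:
        return "high"
    if index >= 0.3:
        return "medium"
    return "low"

idx = risk_index(weights, expert_scores)
print(f"risk index = {idx:.2f}, warning level = {warning_level(idx)}")
```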
The risk matrix is another commonly used tool that prioritizes risks through a two-dimensional "probability-impact" classification. However, traditional matrices ignore non-linear interactions between risks, and Cox (2008) recommended adding "recovery difficulty" as a third dimension. For example, an automobile manufacturer rated the "chip shortage" risk as high probability (0.8), high impact (0.9), and high recovery difficulty (0.7), which triggered a level-one response. By adding three new chip foundries (including TSMC, Intel, and Samsung) and increasing safety stock from 30 days to 120 days, the manufacturer reduced the risk of production line shutdowns from 45% to 12% (Cox, 2008).
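A minimal sketch of the three-dimensional scoring idea follows. The geometric-mean aggregation and the response-level thresholds are assumptions chosen for illustration, not rules prescribed by Cox (2008); only the three chip-shortage ratings (0.8, 0.9, 0.7) come from the example above.

```python
from dataclasses import dataclass

@dataclass
class Risk:
    name: str
    probability: float          # 0-1
    impact: float               # 0-1
    recovery_difficulty: float  # 0-1, the third dimension added to the classic matrix

def composite_score(r: Risk) -> float:
    """Geometric mean of the three dimensions (illustrative aggregation rule)."""
    return (r.probability * r.impact * r.recovery_difficulty) ** (1 / 3)

def response_level(score: float) -> str:
    """Map the composite score to a response level (thresholds are assumptions)."""
    if score >= 0.7:
        return "level one (immediate mitigation)"
    if score >= 0.4:
        return "level two (enhanced monitoring)"
    return "level three (routine tracking)"

chip_shortage = Risk("chip shortage", probability=0.8, impact=0.9, recovery_difficulty=0.7)
score = composite_score(chip_shortage)
print(f"{chip_shortage.name}: score = {score:.2f}, response = {response_level(score)}")
```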
2.2 Quantitative Assessment
Quantitative methods rely on data and algorithms to provide more objective risk metrics. Value at Risk (VaR) is a standard tool in the financial industry for estimating the maximum loss at a specified confidence level, typically through historical simulation or Monte Carlo methods (Jorion, 2006). However, the 2008 financial crisis exposed VaR's tendency to underestimate tail risks, and Acerbi (2013) advocated the Expected Shortfall (ES) measure, which calculates the mean of the tail losses and improves prediction accuracy in extreme scenarios by 30%.
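The contrast between the two measures can be shown with a short historical-simulation sketch: VaR reads off a loss quantile, while ES averages the losses beyond it. The confidence level and the synthetic return series below are illustrative assumptions, not data from the studies cited.

```python
import numpy as np

def var_es_historical(returns, confidence=0.99):
    """Historical-simulation VaR and Expected Shortfall.

    VaR is the loss level not exceeded with the given confidence;
    ES is the mean loss in the tail at or beyond that level.
    """
    losses = -np.asarray(returns)            # convert returns to losses
    var = np.quantile(losses, confidence)    # loss quantile at the confidence level
    tail = losses[losses >= var]             # losses at or beyond the VaR threshold
    es = tail.mean()                         # Expected Shortfall: mean of tail losses
    return var, es

# Synthetic daily returns for illustration (heavier-tailed than a pure normal)
rng = np.random.default_rng(0)
returns = rng.standard_t(df=4, size=10_000) * 0.01

var99, es99 = var_es_historical(returns, confidence=0.99)
print(f"1-day 99% VaR: {var99:.4f}  |  99% ES: {es99:.4f}")
```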
In non-financial fields, the Adaptive Support Vector Machine (ASVM) has shown strong performance. A study of GEM-listed companies combined financial indicators (current ratio, debt-to-asset ratio), market data (volatility, market value-to-revenue ratio), and ESG information (number of patents, carbon emission intensity); the ASVM used a radial basis function (RBF) kernel for non-linear mapping and tuned the penalty parameter C and kernel parameter γ through cross-validation, raising default prediction accuracy to 89%, far above the 72% achieved by traditional logistic regression (Li et al., 2022). A technology company embedded ASVM into its supply chain system to monitor, in real time, 12 indicators such as supplier delivery delay rate (weight 30%), quality qualification rate (25%),