A descriptive adequacy assessment methodology similar to that presented by Gomez-Marco (2015) functions as a suitable benchmark to assess the correctness of the HASPNeL parsing model. Expert evaluators will be given a number of sentences with their corresponding structural representations produced by the system. Each evaluator assesses syntactic criteria per sentence, such as (1) the clause's immediate constituents, (2) each constituent's internal structure, (3) argument structure, (4) identification of categories and projections, and (5) detection of structural ambiguities, assessed only for representations that succeed on criteria 1-4. Each criterion may be assessed as Boolean or on a scale. Evaluators may add comments about their judgements and observations, which will be used to fix bugs in the system's modeling of the theory.
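To make this protocol concrete, the following minimal Python sketch shows one way such judgements could be recorded and aggregated; the Judgement structure, its field names, and the scoring convention are illustrative assumptions on our part, not part of the evaluation instrument itself.

    from dataclasses import dataclass
    from statistics import mean

    # Hypothetical record of one evaluator's judgement of a single parse.
    # The keys of `scores` map onto criteria (1)-(5) above; values may be
    # Boolean (0/1) or points on a scale. Criterion 5 is recorded only
    # for representations that succeed on criteria 1-4.
    @dataclass
    class Judgement:
        sentence_id: str
        evaluator_id: str
        scores: dict        # criterion number (1-5) -> numeric score
        comments: str = ""  # free-text observations, used for debugging

    def criterion_means(judgements, n_criteria=5):
        """Average score per criterion across all evaluators and sentences."""
        return {
            c: mean(j.scores[c] for j in judgements if c in j.scores)
            for c in range(1, n_criteria + 1)
        }

    # Example: two Boolean judgements of the same parse.
    judgements = [
        Judgement("s01", "e1", {1: 1, 2: 1, 3: 1, 4: 1, 5: 1}),
        Judgement("s01", "e2", {1: 1, 2: 1, 3: 0, 4: 1}, "argument structure off"),
    ]
    print(criterion_means(judgements))  # {1: 1, 2: 1, 3: 0.5, 4: 1, 5: 1}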
Since we are working with a synthetic corpus of manageable size, at a later stage of the project we may be able to measure by hand the cases of lexical ambiguity in the annotated synthetic corpus and calculate the likelihood of structural ambiguity in sentences containing lexical units that are ambiguous with respect to the corpus. To assess the system's ambiguity estimation, these measurements may be compared both with the system's detection of possible ambiguities and with its likelihood estimates for each interpretation in those cases.
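As a rough sketch of this comparison pipeline, assuming a Python representation in which the annotated corpus is a list of tokenized sentences, lexical_categories maps each word to its set of possible categories, and parses() returns the structural analyses licensed by the annotation (all three names are hypothetical), the hand-measured likelihoods could be tabulated as follows:

    from collections import Counter

    # Hypothetical interface: `corpus` is a list of token lists,
    # `lexical_categories` maps word -> set of categories, and
    # `parses(sentence)` returns the structural analyses that the
    # annotation licenses for that sentence.
    def ambiguity_likelihoods(corpus, lexical_categories, parses):
        """For each lexically ambiguous word, estimate how often
        sentences containing it are structurally ambiguous."""
        ambiguous = {w for w, cats in lexical_categories.items() if len(cats) > 1}
        containing, multi_parse = Counter(), Counter()
        for sentence in corpus:
            has_multiple_parses = len(parses(sentence)) > 1
            for w in set(sentence) & ambiguous:
                containing[w] += 1
                if has_multiple_parses:
                    multi_parse[w] += 1
        # These relative frequencies are the baseline against which the
        # system's ambiguity detection and likelihood estimates are compared.
        return {w: multi_parse[w] / containing[w] for w in containing}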
6 CONCLUSIONS
Although machine learning systems have the advantage of relatively fast and easy training, they fail to acquire the capacity to detect the structural ambiguity that gives rise to semantic ambiguity. Symbolic systems, on the other hand, do account for structural ambiguities and are suitable for the construction of a knowledge base as a model of human language cognition. The system we propose exploits the advantages of both strategies, as the current literature suggests that NLP implementations are improved by combining resources from both probabilistic and symbolic AI to perform the specific tasks to which each is best suited. Syntactic formalisms of minimalist grammars and tree-adjoining grammars will be implemented in the system, which can be used as a computational model of language knowledge and acquisition, as well as to test current syntactic theory. This system may also serve as a foundation for applications in education, text editing, and the development of other human language technologies, particularly for underrepresented languages, which cannot benefit from big-data approaches.
ACKNOWLEDGEMENTS
This material is based upon work supported by the
National Science Foundation (NSF) under Grant Nos. 2219712 and 2219713. Any opinions, findings, and
conclusions or recommendations expressed in this
material are those of the authors and do not neces-
sarily reflect the views of the NSF.
REFERENCES
Aggarwal, C. C. (2018). Neural Networks and Deep Learning. Springer.
Alers-Valentín, H., Rivera-Velázquez, C. G., Vega-Riveros, J. F., and Santiago, N. G. (2019). Towards a principled computational system of syntactic ambiguity detection and representation. In Proceedings of the 11th International Conference on Agents and Artificial Intelligence - NLPinAI, volume 2, pages 980–987. INSTICC, SciTePress.
Bernardy, J.-P. and Chatzikyriakidis, S. (2019). What kind of natural language inference are NLP systems learning: Is this enough? In Proceedings of the 11th International Conference on Agents and Artificial Intelligence - NLPinAI, volume 2, pages 919–931. INSTICC, SciTePress.
Bikel, D. M. (2004). Intricacies of Collins' parsing model.
Computational Linguistics, 30:479–511.
Brown, T. B., Mann, B., Ryder, N., Subbiah, M., Kaplan, J.,
Dhariwal, P., Neelakantan, A., Shyam, P., Sastry, G.,
Askell, A., Agarwal, S., Herbert-Voss, A., Krueger,
G., Henighan, T., Child, R., Ramesh, A., Ziegler,
D. M., Wu, J., Winter, C., Hesse, C., Chen, M., Sigler,
E., Litwin, M., Gray, S., Chess, B., Clark, J., Berner,
C., McCandlish, S., Radford, A., Sutskever, I., and
Amodei, D. (2020). Language models are few-shot
learners. arXiv preprint arXiv:2005.14165.
Cho, K., van Merriënboer, B., Bahdanau, D., and Bengio, Y. (2014). On the properties of neural machine translation: Encoder–decoder approaches. In Proceedings of SSST-8, Eighth Workshop on Syntax, Semantics and Structure in Statistical Translation, pages 103–111, Doha, Qatar. ACL.
Chomsky, N. (1956). Three models for the description of
language. IRE Transactions on Information Theory,
2:113–124.
Chomsky, N. (1981). Lectures on Government and Binding.
Number 9 in Studies in generative grammar. Foris,
Dordrecht.
Chomsky, N. (1995). The Minimalist Program. MIT Press.
Chomsky, N. (2001). Derivation by phase. In Ken Hale: A Life in Language, pages 1–52. MIT Press.
Chomsky, N. (2008). On phases. In Foundational Issues
in Linguistic Theory: Essays in Honor of Jean-Roger
Vergnaud, pages 133–166. MIT Press.
Chomsky, N. (2021). Minimalism: Where are we now, and
where can we hope to go. Gengo Kenkyu, 160:1–41.