JDeveloper Studio Edition Version 12.2.1.4.0
I'm trying to integrate Stanford CoreNLP (specifically the 400MB+ stanford-corenlp-4.5.4-models.jar) into an Oracle ADF application, but I'm facing deployment challenges due to the JAR's size. The main issues are:
- EAR file size becomes unmanageable (>500MB)
- Slow deployment to WebLogic servers
- Memory constraints during application startup
Current Approach:
- Using CoreNLP for text processing in managed beans
- Basic POS tagging and tokenization requirements
- Intermittent JVM memory errors at startup when the model files can't be found
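To make the "models not found" case fail fast with a clear message instead of surfacing later as a memory error, I've been experimenting with resolving the model location from a JVM system property at startup. This is only a sketch; the property name `corenlp.models.dir` is my own convention, not anything CoreNLP or ADF defines:

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class ModelPathCheck {

    /**
     * Resolve the external CoreNLP models directory from a JVM system
     * property (the name "corenlp.models.dir" is my own convention).
     * Fails fast with an explicit message instead of a late memory error
     * when the models are missing.
     */
    static Path resolveModelsDir() {
        String dir = System.getProperty("corenlp.models.dir");
        if (dir == null || dir.isEmpty()) {
            throw new IllegalStateException(
                "System property corenlp.models.dir is not set; "
                + "pass -Dcorenlp.models.dir=/path/to/models to the WebLogic JVM");
        }
        Path p = Paths.get(dir);
        if (!Files.isDirectory(p)) {
            throw new IllegalStateException("Models directory not found: " + p);
        }
        return p;
    }

    public static void main(String[] args) {
        // Demo only: point the property at an existing directory.
        System.setProperty("corenlp.models.dir", System.getProperty("java.io.tmpdir"));
        System.out.println("Models dir: " + resolveModelsDir());
    }
}
```

The idea would be to set the property in the WebLogic server start arguments so the models live outside the EAR entirely, but I'm not sure this is the idiomatic ADF way.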
Specific Questions:
- What's the recommended way to handle large external JARs in ADF?
- Are there patterns for lazy-loading resource-intensive libraries?
- How to properly reference external model files from ADF applications?
- Has anyone successfully integrated Stanford NLP with ADF?
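On the lazy-loading question, the best I've come up with so far is a plain double-checked-locking holder so the expensive pipeline is only built on first use rather than at bean creation. This is a generic sketch; in the real managed bean the `Supplier` would construct the `StanfordCoreNLP` pipeline, and all names here are illustrative:

```java
import java.util.function.Supplier;

/**
 * Lazy, thread-safe holder for an expensive resource (sketch).
 * In the real app the Supplier would build the StanfordCoreNLP pipeline.
 */
final class LazyResource<T> {
    private final Supplier<T> factory;
    private volatile T instance;

    LazyResource(Supplier<T> factory) {
        this.factory = factory;
    }

    T get() {
        T local = instance;              // single volatile read on the fast path
        if (local == null) {
            synchronized (this) {
                local = instance;
                if (local == null) {
                    local = factory.get();   // built on first use only
                    instance = local;
                }
            }
        }
        return local;
    }
}

public class LazyDemo {
    public static void main(String[] args) {
        LazyResource<String> pipeline = new LazyResource<>(() -> {
            System.out.println("building pipeline...");
            return "pipeline";
        });
        System.out.println("bean created, nothing loaded yet");
        System.out.println(pipeline.get());
        System.out.println(pipeline.get()); // factory is not invoked again
    }
}
```

This avoids paying the model-loading cost at application startup, but it doesn't solve the EAR size problem, so I'd still like to hear how others reference the models JAR from outside the deployment.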