Category Archives: JAVA

[Solved] com.mongodb.MongoSocketOpenException: Exception opening socket when Java connects to MongoDB

Connecting to MongoDB from a Java program written in IDEA fails with com.mongodb.MongoSocketOpenException: Exception opening socket.

Solution

Edit the MongoDB configuration file:

vim mongodb/conf/mongod.conf

Then add:

bind_ip=0.0.0.0

After modification, the configuration file is as follows:

#Specify the start port
port=27017
#Specify the data storage directory
dbpath=data/db
#Specify the log storage directory
logpath=log/mongodb.log
#Run in the background
fork=true
#Accept connections from any IP
bind_ip=0.0.0.0

Cause

Since MongoDB 3.6, bind_ip defaults to localhost, so the server only accepts connections made through localhost.

Changing it to 0.0.0.0 allows logins from any IP.
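
For reference, a minimal connection sketch using the MongoDB Java driver (mongodb-driver-sync); the host 192.168.1.100 and the database name test are placeholders for your own values:

import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoDatabase;

public class MongoConnectTest {
    public static void main(String[] args) {
        // Replace 192.168.1.100 with your server's address; before bind_ip
        // is fixed, this fails with MongoSocketOpenException.
        try (MongoClient client = MongoClients.create("mongodb://192.168.1.100:27017")) {
            MongoDatabase db = client.getDatabase("test");
            db.listCollectionNames().forEach(System.out::println);
        }
    }
}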

[Solved] java.lang.NoClassDefFoundError: com/sun/image/codec/jpeg/ImageFormatException (Upload Images Error)

1. Symptom

The image upload function needs the following compiler plugin configuration for the local build; without it, compilation fails.

<build>
  <plugins>
    <plugin>
      <artifactId>maven-compiler-plugin</artifactId>
      <version>3.3</version>
      <configuration>
        <source>1.8</source>
        <target>1.8</target>
        <encoding>utf-8</encoding>
        <compilerArguments>
          <verbose />
          <bootclasspath>${java.home}/lib/rt.jar:${java.home}/lib/jce.jar</bootclasspath>
          <!--<bootclasspath>${java.home}\lib\rt.jar;${java.home}\lib\jce.jar</bootclasspath>-->
        </compilerArguments>
      </configuration>
    </plugin>
  </plugins>
</build>
The local test then passed, but deploying to the test environment produced: org.springframework.web.util.NestedServletException: Handler dispatch failed; nested exception is java.lang.NoClassDefFoundError: com/sun/image/codec/jpeg/ImageFormatException

2. Cause

The image compression code uses classes from the com.sun.image.codec.jpeg package. The test environment runs OpenJDK, whose rt.jar does not contain com/sun/image/codec/jpeg, so the reference to this class fails at runtime.

3. Solution: Install the standard (Oracle) JDK

cd /etc

vim profile

export JAVA_HOME=/usr/local/jdk8
export CLASSPATH=.:$JAVA_HOME/lib/tools.jar:$JAVA_HOME/lib/dt.jar
export PATH=$JAVA_HOME/bin:$HOME/bin:$HOME/.local/bin:$PATH

source profile

java -version
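
Alternatively, if installing the Oracle JDK is not an option, the non-standard com.sun.image.codec.jpeg API can usually be replaced with javax.imageio, which is part of the public JDK API and also works on OpenJDK. A minimal sketch (input.png and output.jpg are placeholder file names):

import java.awt.Color;
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import javax.imageio.ImageIO;

public class JpegWriteDemo {
    public static void main(String[] args) throws IOException {
        BufferedImage src = ImageIO.read(new File("input.png"));
        // JPEG has no alpha channel, so draw onto an RGB canvas first
        BufferedImage rgb = new BufferedImage(src.getWidth(), src.getHeight(),
                BufferedImage.TYPE_INT_RGB);
        rgb.createGraphics().drawImage(src, 0, 0, Color.WHITE, null);
        // Write the JPEG using only public JDK APIs, no com.sun.* classes
        ImageIO.write(rgb, "jpg", new File("output.jpg"));
    }
}

Drawing onto the TYPE_INT_RGB canvas first avoids the transparency issues the JPEG writer cannot handle.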

[Solved] java.lang.NoClassDefFoundError: com.sun.crypto.provider.SunJCE

When calling Sun's 3DES (DESede) encryption and decryption algorithm in Java, you need the following four JARs under $JAVA_HOME/jre/lib:
jce.jar
security/US_export_policy.jar
security/local_policy.jar
ext/sunjce_provider.jar
These JARs are loaded automatically at runtime, so standalone applications (with a main method) do not need to add them to the CLASSPATH. Web applications, however, must copy them into WEB-INF/lib.
The following is sample code calling Sun's 3DES encryption and decryption algorithm in Java:
/* String DESede (3DES) encryption */
import java.security.*;
import javax.crypto.*;
import javax.crypto.spec.SecretKeySpec;

public class ThreeDes {
    // Encryption algorithm; DES, DESede and Blowfish are available
    private static final String Algorithm = "DESede";

    // keybyte is the encryption key, 24 bytes long
    // src is the data buffer to encrypt (source)
    public static byte[] encryptMode(byte[] keybyte, byte[] src) {
        try {
            // Generate the key
            SecretKey deskey = new SecretKeySpec(keybyte, Algorithm);
            // Encrypt
            Cipher c1 = Cipher.getInstance(Algorithm);
            c1.init(Cipher.ENCRYPT_MODE, deskey);
            return c1.doFinal(src);
        } catch (java.security.NoSuchAlgorithmException e1) {
            e1.printStackTrace();
        } catch (javax.crypto.NoSuchPaddingException e2) {
            e2.printStackTrace();
        } catch (java.lang.Exception e3) {
            e3.printStackTrace();
        }
        return null;
    }

    // keybyte is the encryption key, 24 bytes long
    // src is the buffer to decrypt
    public static byte[] decryptMode(byte[] keybyte, byte[] src) {
        try {
            // Generate the key
            SecretKey deskey = new SecretKeySpec(keybyte, Algorithm);
            // Decrypt
            Cipher c1 = Cipher.getInstance(Algorithm);
            c1.init(Cipher.DECRYPT_MODE, deskey);
            return c1.doFinal(src);
        } catch (java.security.NoSuchAlgorithmException e1) {
            e1.printStackTrace();
        } catch (javax.crypto.NoSuchPaddingException e2) {
            e2.printStackTrace();
        } catch (java.lang.Exception e3) {
            e3.printStackTrace();
        }
        return null;
    }

    // Convert a byte array into a hexadecimal string
    public static String byte2hex(byte[] b) {
        String hs = "";
        String stmp = "";
        for (int n = 0; n < b.length; n++) {
            stmp = Integer.toHexString(b[n] & 0xFF);
            if (stmp.length() == 1) hs = hs + "0" + stmp;
            else hs = hs + stmp;
            if (n < b.length - 1) hs = hs + ":";
        }
        return hs.toUpperCase();
    }

    public static void main(String[] args) {
        // Register the security provider (needed when using JCE)
        Security.addProvider(new com.sun.crypto.provider.SunJCE());
        // 24-byte key
        final byte[] keyBytes = {0x11, 0x22, 0x4F, 0x58,
                (byte) 0x88, 0x10, 0x40, 0x38, 0x28, 0x25, 0x79, 0x51,
                (byte) 0xCB, (byte) 0xDD, 0x55, 0x66, 0x77, 0x29, 0x74,
                (byte) 0x98, 0x30, 0x40, 0x36, (byte) 0xE2};
        String szSrc = "This is a 3DES test. Test";
        System.out.println("String before encryption: " + szSrc);
        byte[] encoded = encryptMode(keyBytes, szSrc.getBytes());
        // Print the ciphertext as hex; new String(encoded) would print garbage
        System.out.println("Encrypted string: " + byte2hex(encoded));
        byte[] srcBytes = decryptMode(keyBytes, encoded);
        System.out.println("Decrypted string: " + new String(srcBytes));
    }
}

Docker port is already allocated [How to Solve]

Find the docker-proxy process (the second column of the output is the PID):

ps -aux | grep -v grep | grep docker-proxy

Stop the Docker service, remove all containers, delete the local-kv.db file, and then start Docker again:

sudo service docker stop
docker rm $(docker ps -aq)
sudo rm /var/lib/docker/network/files/local-kv.db
sudo service docker start

 

Note that this stops Docker for the whole machine, and some containers will need to be started again manually.

[Solved] IDEA JDK is 1.8 Warning: Diamond types are not supported at this language level

Help -> About shows the IDE version:

IntelliJ IDEA 2017.3.7 (Ultimate Edition)
Build #IU-173.4710.11, built on April 4, 2019
Licensed to Rover12421/Rover12421
You have a perpetual fallback license for this version
Subscription is active until December 31, 2099
JRE: 1.8.0_152-release-1024-b18 amd64
JVM: OpenJDK 64-Bit Server VM by JetBrains s.r.o
Windows 7 6.1

A newly created Maven project (set up by simply clicking Next) produced the following error on new KafkaProducer<>():

Diamond types are not supported at this language level

The diamond operator is a JDK 1.7 feature, so language level 1.5 certainly does not support it.
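
For context, a small sketch of what the diamond operator does; the class and variable names are illustrative:

import java.util.ArrayList;
import java.util.List;

public class DiamondDemo {
    public static void main(String[] args) {
        // Java 7+: the compiler infers <String> from the left-hand side
        List<String> names = new ArrayList<>();
        // Language level 5/6 requires spelling the type arguments out:
        // List<String> names = new ArrayList<String>();
        names.add("kafka");
        System.out.println(names);
    }
}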

Right-click the project and select Open Module Settings.

In the dialog that opens, the problem is visible: the default Language level is still 5.0. Change it to 8 to match the project JDK.

If Maven itself fails to compile the project with "diamond operator is not supported in -source 1.5", set the compiler level in a profile such as:

<profile>
  <id>jdk-1.7</id>
  <activation>
    <activeByDefault>true</activeByDefault>
    <jdk>1.7</jdk>
  </activation>
  <properties>
    <maven.compiler.source>1.7</maven.compiler.source>
    <maven.compiler.target>1.7</maven.compiler.target>
    <maven.compiler.compilerVersion>1.7</maven.compiler.compilerVersion>
  </properties>
</profile>

[Solved] Error: JAVA_HOME is not set and could not be found.

Problem:

Installing CDH on CentOS 7 fails with: Error: JAVA_HOME is not set and could not be found.

+ local name=log4j.properties
+ '[' '!' -f /opt/cloudera-manager/cm-5.14.1/run/cloudera-scm-agent/process/ccdeploy_spark-conf_etcsparkconf.cloudera.spark_on_yarn_6447222260596468198/spark-conf/yarn-conf/log4j.properties ']'
+ mv /opt/cloudera-manager/cm-5.14.1/run/cloudera-scm-agent/process/ccdeploy_spark-conf_etcsparkconf.cloudera.spark_on_yarn_6447222260596468198/hbase-conf/log4j.properties /opt/cloudera-manager/cm-5.14.1/run/cloudera-scm-agent/process/ccdeploy_spark-conf_etcsparkconf.cloudera.spark_on_yarn_6447222260596468198/spark-conf/yarn-conf
+ for i in '"$HBASE_CONF_DIR"/*'
++ basename /opt/cloudera-manager/cm-5.14.1/run/cloudera-scm-agent/process/ccdeploy_spark-conf_etcsparkconf.cloudera.spark_on_yarn_6447222260596468198/hbase-conf/ssl-client.xml
+ local name=ssl-client.xml
+ '[' '!' -f /opt/cloudera-manager/cm-5.14.1/run/cloudera-scm-agent/process/ccdeploy_spark-conf_etcsparkconf.cloudera.spark_on_yarn_6447222260596468198/spark-conf/yarn-conf/ssl-client.xml ']'
++ get_default_fs /opt/cloudera-manager/cm-5.14.1/run/cloudera-scm-agent/process/ccdeploy_spark-conf_etcsparkconf.cloudera.spark_on_yarn_6447222260596468198/spark-conf/yarn-conf
++ get_hadoop_conf /opt/cloudera-manager/cm-5.14.1/run/cloudera-scm-agent/process/ccdeploy_spark-conf_etcsparkconf.cloudera.spark_on_yarn_6447222260596468198/spark-conf/yarn-conf fs.defaultFS
++ local conf=/opt/cloudera-manager/cm-5.14.1/run/cloudera-scm-agent/process/ccdeploy_spark-conf_etcsparkconf.cloudera.spark_on_yarn_6447222260596468198/spark-conf/yarn-conf
++ local key=fs.defaultFS
++ '[' 1 == 1 ']'
++ /opt/cloudera/parcels/CDH-5.14.4-1.cdh5.14.4.p0.3/lib/hadoop/../../bin/hdfs --config /opt/cloudera-manager/cm-5.14.1/run/cloudera-scm-agent/process/ccdeploy_spark-conf_etcsparkconf.cloudera.spark_on_yarn_6447222260596468198/spark-conf/yarn-conf getconf -confKey fs.defaultFS
Error: JAVA_HOME is not set and could not be found.
+ DEFAULT_FS=

Solution:

Install the JDK via rpm.

Download jdk-8u201-linux-x64.rpm

chmod 755 jdk-8u201-linux-x64.rpm
rpm -i jdk-8u201-linux-x64.rpm

Then configure the environment variables (e.g. in /etc/profile):

export JAVA_HOME=/usr/java/jdk1.8.0_201-amd64
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar:$JAVA_HOME/jre/lib/rt.jar
export PATH=$PATH:$JAVA_HOME/bin

JAVA_HOME is not defined correctly [How to Solve]

It's a strange problem: the system had been running fine, then it suddenly broke, complaining that various Java packages were missing.

1. Check the Maven configuration in ~/.bash_profile

2. Check ~/.mavenrc, the file Maven reads when it runs

Running java -version works fine, all normal.

Running mvn -version, however:

Error: JAVA_HOME is not defined correctly.
  We cannot execute Library/Java/JavaVirtualMachines/jdk1.8.0_101.jdk/Contents/Home/bin/java

So I started going through the configuration files.

##### Two JDKs (7 and 8) are installed on this machine; this command shows which one is active
echo $JAVA_HOME
##### normally it is JDK 8:
Library/Java/JavaVirtualMachines/jdk1.8.0_101.jdk/Contents/Home

##### check whether mvn is missing, also normal
which mvn
/Users/****/Documents/maven/apache-maven-3.3.9/bin/mvn

##### started checking the configuration files and they were fine
vim ~/.bash_profile

Checked PATH=$PATH:$MAVEN_HOME/bin and it's not missing (maven configuration at the end of the article)

Feeling stuck, I checked the other file that Maven reads at runtime.

##### go into the home directory
cd $HOME

##### look for the file
ls -a

##### no .mavenrc found

##### frustrating...

##### so just create it directly
vim .mavenrc

##### and type in
JAVA_HOME=$(/usr/libexec/java_home)

##### save
ESC ---- :wq! 

##### Check
mvn -version

##### normal
Apache Maven 3.3.9


[Solved] Spark-HBase Error: java.lang.NoClassDefFoundError: org/htrace/Trace

In the process of integrating Spark with HBase, the following problem occurred:

java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionStateBuilder':
  at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$instantiateSessionState(SparkSession.scala:1053)
  at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:130)
  at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:130)
  at scala.Option.getOrElse(Option.scala:121)
  at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:129)
  at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:126)
  at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:938)
  at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:938)
  at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
  at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
  at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
  at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
  at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
  at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:938)
  at org.apache.spark.repl.Main$.createSparkSession(Main.scala:97)
  ... 47 elided
Caused by: java.lang.ClassNotFoundException: java.lang.NoClassDefFoundError: org/htrace/Trace when creating Hive client using classpath:......

Please make sure that jars for your version of hive and hadoop are included in the paths passed to spark.sql.hive.metastore.jars.
  at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:270)
  at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:362)
  at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:266)
  at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
  at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
  at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:194)
  at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:194)
  at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:194)
  at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
  at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:193)
  at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:105)
  at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:93)
  at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:39)
  at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:54)
  at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52)
  at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:35)
  at org.apache.spark.sql.internal.BaseSessionStateBuilder.build(BaseSessionStateBuilder.scala:289)
  at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$instantiateSessionState(SparkSession.scala:1050)
  ... 61 more
Caused by: java.lang.reflect.InvocationTargetException: java.lang.NoClassDefFoundError: org/htrace/Trace
  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
  at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
  at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
  ... 78 more
Caused by: java.lang.NoClassDefFoundError: org/htrace/Trace
  at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:214)
  at com.sun.proxy.$Proxy17.getFileInfo(Unknown Source)
  at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:752)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:497)
  at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
  at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
  at com.sun.proxy.$Proxy18.getFileInfo(Unknown Source)
  at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1977)
  at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1118)
  at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1114)
  at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
  at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1114)
  at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1400)
  at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:596)
  at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:554)
  at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)
  at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:191)
  ... 83 more
Caused by: java.lang.ClassNotFoundException: org.htrace.Trace
  at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
  at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
  at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
  at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
  ... 103 more
<console>:14: error: not found: value spark
       import spark.implicits._
              ^
<console>:14: error: not found: value spark
       import spark.sql
              ^
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.2.0
      /_/
         
Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_60)
Type in expressions to have them evaluated.
Type :help for more information.

scala>
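
Solution: the root cause at the bottom of the trace is java.lang.ClassNotFoundException: org.htrace.Trace. That class comes from the htrace-core library (the 3.0.x releases use the org.htrace package; later releases renamed it to org.apache.htrace). A fix that typically works, assuming a standard cluster layout, is to put the htrace-core JAR shipped with your Hadoop/HBase installation on Spark's classpath, either by copying it into Spark's jars directory or by launching with spark-shell --jars /path/to/htrace-core-3.0.4.jar (the exact version and path depend on your installation). Then restart spark-shell.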

[Solved] Error occurred during initialization of boot layer

[Exception description]

Error occurred during initialization of boot layer
java.lang.module.FindException: Error reading module: E:\Users\Administrator\eclipse-workspace\Example0\bin
Caused by: java.lang.module.InvalidModuleDescriptorException: helloMyJava.class found in top-level directory (unnamed package not allowed in module)

JDK version: 10.0.2

 

Method 1: Create a new package and drag the source files out of the default package into it, then refresh; the default package disappears (see the sketch at the end of this post).

 

Method 2: Delete module-info.java
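
For reference, Method 1 works because it gives the class a package declaration, which satisfies the module system's rule that an unnamed (default) package is not allowed inside a module. A minimal sketch; the package name example is a placeholder:

// src/example/HelloMyJava.java
package example; // moving the file out of the default package adds this line

public class HelloMyJava {
    public static void main(String[] args) {
        System.out.println("Hello from a named package");
    }
}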