Author Archives: Robins

How to Solve IDEA ojdbc Package Import Error

Download the two jars online or get them from a colleague: ojdbc6-11.2.0.3.jar and jconn3-6.0.jar.

Put them in the same folder.

Open a CMD console in that folder (for example, type cmd in the Explorer address bar and press Enter).

Enter the following two commands:

mvn install:install-file -DgroupId=com.oracle -DartifactId=ojdbc6 -Dversion=11.2.0.3 -Dpackaging=jar -Dfile=ojdbc6-11.2.0.3.jar

mvn install:install-file -DgroupId=com.sybase -DartifactId=jconn -Dversion=3-6.0 -Dpackaging=jar -Dfile=jconn3-6.0.jar

A BUILD SUCCESS message means the jar was added successfully. If the build fails, check that the version number, groupId, file name, etc. in the command match the actual jar, then run it again after correcting them.

Note: if you have multiple Maven versions installed, check which installation MAVEN_HOME currently points to. This command installs the jar into the local Maven repository; if it lands in the wrong repository, the import will still fail.

After installation, right-click the project and update the project.

Click OK, then run Maven install again; no error should be reported.
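Once both jars are installed, they can be referenced from the project's pom.xml. A minimal sketch; the coordinates simply mirror the install commands above:

```xml
<!-- Coordinates match the -DgroupId/-DartifactId/-Dversion values used above -->
<dependency>
    <groupId>com.oracle</groupId>
    <artifactId>ojdbc6</artifactId>
    <version>11.2.0.3</version>
</dependency>
<dependency>
    <groupId>com.sybase</groupId>
    <artifactId>jconn</artifactId>
    <version>3-6.0</version>
</dependency>
```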

[Solved] org.apache.catalina.startup.HostConfig.deployWAR Error deploying web application archive

At 23:00 yesterday, an error appeared after an old project was packaged and deployed to a cloud server: Tomcat 9 would not start, which cost me several hours.

I had run into this error before but had long since forgotten the fix, so I am recording it here.

Error log

22-Dec-2021 23:52:18.703 SEVERE [main] org.apache.catalina.startup.HostConfig.deployWAR Error deploying web application archive [/usr/tomcat/webapps/project.war]
    java.lang.IllegalStateException: Error starting child
        at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:731)
        at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:700)
        at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:696)
        at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1024)
        at org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1911)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at org.apache.tomcat.util.threads.InlineExecutorService.execute(InlineExecutorService.java:75)
        at java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:112)
        at org.apache.catalina.startup.HostConfig.deployWARs(HostConfig.java:825)
        at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:475)
        at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1618)
        at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:319)
        at org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:123)
        at org.apache.catalina.util.LifecycleBase.setStateInternal(LifecycleBase.java:423)
        at org.apache.catalina.util.LifecycleBase.setState(LifecycleBase.java:366)
        at org.apache.catalina.core.ContainerBase.startInternal(ContainerBase.java:948)
        at org.apache.catalina.core.StandardHost.startInternal(StandardHost.java:835)
        at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:183)
        at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1398)
        at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1388)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at org.apache.tomcat.util.threads.InlineExecutorService.execute(InlineExecutorService.java:75)
        at java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:134)
        at org.apache.catalina.core.ContainerBase.startInternal(ContainerBase.java:921)
        at org.apache.catalina.core.StandardEngine.startInternal(StandardEngine.java:263)
        at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:183)
        at org.apache.catalina.core.StandardService.startInternal(StandardService.java:437)
        at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:183)
        at org.apache.catalina.core.StandardServer.startInternal(StandardServer.java:934)
        at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:183)
        at org.apache.catalina.startup.Catalina.start(Catalina.java:772)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:345)
        at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:476)
    Caused by: org.apache.catalina.LifecycleException: Failed to start component [StandardEngine[Catalina].StandardHost[localhost].StandardContext[/project]]
        at org.apache.catalina.util.LifecycleBase.handleSubClassException(LifecycleBase.java:440)
        at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:198)
        at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:728)
        ... 37 more

Process

My environment is CentOS 7 + Tomcat 9 + JDK 8 + Spring Boot 2.x

After the error appeared, I searched a lot online, and solutions of every kind came up.

But none of them was any use; most were the usual noise.

Suggestions found online:

1. Change a random-number-related setting under JAVA_HOME/jre/lib (tried it; useless)

2. Change the Tomcat version (never tried; too much trouble)

3. Change some configuration file under Tomcat's conf directory (obviously useless)

4. Some say the JDK version is too low (mine is 1.8, so that's not it)

5. Some say the version of the Spring framework is inconsistent with the environment (if the project runs in the IDE, this is usually not the problem)

6. One reliable post: the docBase attribute of the Context node under the Host node in Tomcat/conf/server.xml is wrong (plausible, but mine was fine)

Solution:

Empty the target folder, package again, and redeploy: success!

[Solved] IDEA 2021.3 Error: Error launching IDEA if you already have a 64-bit JDK installed, define a JAVA_HOME

While setting up IDEA, if you modify the .vmoptions file and then restart IntelliJ IDEA, the following error may be reported.

Solution:

In the directory C:\Users\${your.username}\AppData\Roaming\JetBrains\IntelliJIdea2020.3 there is a file named idea64.exe.vmoptions; just delete the modified file.

 

PS: I had installed the JDK separately before installing IDEA, which did not affect the IDEA installation. Once the problem above is fixed, the normal setup steps work again and a new .vmoptions file is regenerated.

[Solved] pip/pytest Error: Fatal error in launcher: Unable to create process using

After moving the Python directory, running an EXE under Python\Scripts gives the following error:

Fatal error in launcher: Unable to create process using ‘”e:\myidle\python3\python.exe” “F:\MyIDLE\Python3\Scripts\pytest.exe” ‘: ???????????
The reason is that after the directory was moved, the interpreter path embedded in the exe did not change. Open the exe file with Notepad++, scroll to the bottom, and change the path to the new location of the Python directory.
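The Notepad++ edit above amounts to replacing one embedded byte string with another. A minimal Python sketch of the same idea, demonstrated on synthetic bytes rather than a real launcher (with a real .exe you would read and rewrite the file in binary mode; the paths are the ones from the error message):

```python
# Sketch of the Notepad++ fix: replace the old interpreter path embedded in
# a launcher .exe with the new one. Demonstrated on synthetic bytes.

def patch_launcher_bytes(data: bytes, old_path: bytes, new_path: bytes) -> bytes:
    """Replace the embedded interpreter path in launcher bytes."""
    if old_path not in data:
        raise ValueError("old interpreter path not found")
    return data.replace(old_path, new_path)

# Synthetic stand-in for the tail of a launcher exe (paths from the error message):
fake_exe = b'MZ...binary...#!"e:\\myidle\\python3\\python.exe"\r\nPK...'
patched = patch_launcher_bytes(
    fake_exe,
    b"e:\\myidle\\python3\\python.exe",
    b"F:\\MyIDLE\\Python3\\python.exe",
)
```

An alternative that avoids byte editing is reinstalling the affected package with the moved interpreter (e.g. F:\MyIDLE\Python3\python.exe -m pip install --force-reinstall pytest), which regenerates the launcher exes.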

[Solved] Gradle Error: Could not resolve all dependencies for configuration ‘:detachedConfiguration7’

After changing the repository in build.gradle, this error appears:

Could not resolve all dependencies for configuration ':detachedConfiguration7'.
Using insecure protocols with repositories, without explicit opt-in, is unsupported. Switch Maven repository 'maven(http://maven.aliyun.com/nexus/content/groups/public/)' to redirect to a secure protocol (like HTTPS) or allow insecure protocols. See https://docs.gradle.org/7.0.2/dsl/org.gradle.api.artifacts.repositories.UrlArtifactRepository.html#org.gradle.api.artifacts.repositories.UrlArtifactRepository:allowInsecureProtocol for more details.

 

Solution:
Method 1: add allowInsecureProtocol = true inside the maven repository block:

plugins {
    id 'org.springframework.boot' version '2.5.2'
    id 'io.spring.dependency-management' version '1.0.11.RELEASE'
    id 'java'
}

group = 'com.example'
version = '1.0.0'
sourceCompatibility = '1.8'

repositories {
    // mavenCentral()
    maven {
        allowInsecureProtocol = true
        url 'http://maven.aliyun.com/nexus/content/groups/public/'
    }
}

dependencies {
    implementation 'org.springframework.boot:spring-boot-starter-web'
    testImplementation 'org.springframework.boot:spring-boot-starter-test'

    implementation 'org.apache.logging.log4j:log4j-core:2.14.1'
}

test {
    useJUnitPlatform()
}

Method 2: change the http URL to https:

plugins {
    id 'org.springframework.boot' version '2.5.2'
    id 'io.spring.dependency-management' version '1.0.11.RELEASE'
    id 'java'
}

group = 'com.example'
version = '1.0.0'
sourceCompatibility = '1.8'

repositories {
    maven {
        url 'https://maven.aliyun.com/nexus/content/groups/public/'
    }
}

dependencies {
    implementation 'org.springframework.boot:spring-boot-starter-web'
    testImplementation 'org.springframework.boot:spring-boot-starter-test'

    implementation 'org.apache.logging.log4j:log4j-core:2.14.1'
}

test {
    useJUnitPlatform()
}

[Solved] IDEA Remote Operate hdfs Hadoop Error: Caused by: java.net.ConnectException: Connection refused: no further information

When operating HDFS on Hadoop remotely from IDEA, the following error is reported: Caused by: java.net.ConnectException: Connection refused: no further information

1. Source code

package com.github.td.hdfs;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.junit.Test;

import java.io.IOException;
import java.net.URI;
import java.net.URISyntaxException;

/**
 * @Description: TODO
 * @Author: 
 * @DateTime: 2021/12/20 17:52
 **/
public class HdfsClient {

    @Test
    public void testMkdirs() throws IOException, URISyntaxException, InterruptedException{

        // 1 Get the file system
        Configuration configuration = new Configuration();
        // master == 192.168.xx.xx
        // the port of cdh version is 8020, the port of common version Hadoop is 9000.
        FileSystem fs = FileSystem.get(new URI("hdfs://master:8020"), configuration, "root");

        // 2 Create Directory
        fs.mkdirs(new Path("/hadoop/hdfs-api"));

        // 3 Close resource
        fs.close();
    }
}

2. Error log

java.net.ConnectException: Call From to master:8020 failed on connection exception: java.net.ConnectException: Connection refused: no further information; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused

	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:831)
	at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:755)
	at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1549)
	at org.apache.hadoop.ipc.Client.call(Client.java:1491)
	at org.apache.hadoop.ipc.Client.call(Client.java:1388)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118)
	at com.sun.proxy.$Proxy12.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:657)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)
	at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)
	at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)
	at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)
	at com.sun.proxy.$Proxy13.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2420)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2396)
	at org.apache.hadoop.hdfs.DistributedFileSystem$27.doCall(DistributedFileSystem.java:1319)
	at org.apache.hadoop.hdfs.DistributedFileSystem$27.doCall(DistributedFileSystem.java:1316)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1333)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1308)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:2275)
	at com.github.td.hdfs.HdfsClient.testMkdirs(HdfsClient.java:29)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
	at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
	at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:68)
	at com.intellij.rt.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:33)
	at com.intellij.rt.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:230)
	at com.intellij.rt.junit.JUnitStarter.main(JUnitStarter.java:58)
Caused by: java.net.ConnectException: Connection refused: no further information
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:715)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:700)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:804)
	at org.apache.hadoop.ipc.Client$Connection.access$3800(Client.java:421)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1606)
	at org.apache.hadoop.ipc.Client.call(Client.java:1435)
	... 46 more

3. Cause analysis

Port problem:

In a CDH environment, HDFS uses port 8020: conf.set("fs.defaultFS", "hdfs://192.168.0.4:8020");

In a vanilla Apache Hadoop environment, HDFS uses port 9000: conf.set("fs.defaultFS", "hdfs://192.168.0.121:9000");
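Before touching the client code, it can help to confirm which port the NameNode is actually reachable on. A small Python sketch (the hostname and ports are illustrative) that checks TCP reachability:

```python
# Quick reachability check for the NameNode RPC port, to distinguish
# "wrong port in the URI" from other connection failures.
import socket

def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# e.g. compare port_open("master", 8020) with port_open("master", 9000)
```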

4. Solution

Change the port number to 9000: FileSystem.get(new URI("hdfs://master:9000"), configuration, "root");