Category Archives: Error

[Solved] Run Error without DevExpress Installed: System.IO.FileNotFoundException, Error Module Name: kernelbase.dll

Resolution process:


1. Suspecting that the referenced DLLs had not been copied locally, I checked that every referenced DLL was present in the output directory. The program still failed with the same error.
2. Suspecting that the platform target (x86 or x64) was not specified, I built explicit x86 and x64 versions. The program still failed with the same error.
3. After installing the DevExpress controls on the test computer, the program ran normally.

Solution:


The root cause is that the DevExpress DLLs were not copied completely. Open your project in Visual Studio; under “Tools” in the menu bar there is a “DevExpress Assembly Deployment Tool”. With this tool you can export the DevExpress-related DLLs used by the current project.

One special note: if you use the built-in DevExpress icons, also ship DevExpress.Images.v{version number}.dll with your project.

[Solved] Gradle Error: Could not resolve all dependencies for configuration ‘:detachedConfiguration7’

Changing the repository in build.gradle produces the following error:

Could not resolve all dependencies for configuration ':detachedConfiguration7'.
Using insecure protocols with repositories, without explicit opt-in, is unsupported. Switch Maven repository 'maven(http://maven.aliyun.com/nexus/content/groups/public/)' to redirect to a secure protocol (like HTTPS) or allow insecure protocols. See https://docs.gradle.org/7.0.2/dsl/org.gradle.api.artifacts.repositories.UrlArtifactRepository.html#org.gradle.api.artifacts.repositories.UrlArtifactRepository:allowInsecureProtocol for more details.

 

Solution:
Method 1: Add the property allowInsecureProtocol = true inside the maven repository block:

plugins {
    id 'org.springframework.boot' version '2.5.2'
    id 'io.spring.dependency-management' version '1.0.11.RELEASE'
    id 'java'
}

group = 'com.example'
version = '1.0.0'
sourceCompatibility = '1.8'

repositories {
    // mavenCentral()
    maven {
        allowInsecureProtocol = true
        url 'http://maven.aliyun.com/nexus/content/groups/public/'
    }
}

dependencies {
    implementation 'org.springframework.boot:spring-boot-starter-web'
    testImplementation 'org.springframework.boot:spring-boot-starter-test'

    implementation 'org.apache.logging.log4j:log4j-core:2.14.1'
}

test {
    useJUnitPlatform()
}

Method 2: Change the repository URL from http to https:

plugins {
    id 'org.springframework.boot' version '2.5.2'
    id 'io.spring.dependency-management' version '1.0.11.RELEASE'
    id 'java'
}

group = 'com.example'
version = '1.0.0'
sourceCompatibility = '1.8'

repositories {
    maven {
        url 'https://maven.aliyun.com/nexus/content/groups/public/'
    }
}

dependencies {
    implementation 'org.springframework.boot:spring-boot-starter-web'
    testImplementation 'org.springframework.boot:spring-boot-starter-test'

    implementation 'org.apache.logging.log4j:log4j-core:2.14.1'
}

test {
    useJUnitPlatform()
}

[Solved] IDEA Remote HDFS (Hadoop) Operation Error: Caused by: java.net.ConnectException: Connection refused: no further information

Operating HDFS (Hadoop) remotely from IDEA reports the following error: Caused by: java.net.ConnectException: Connection refused: no further information

1. Source code

package com.github.td.hdfs;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.junit.Test;

import java.io.IOException;
import java.net.URI;
import java.net.URISyntaxException;

/**
 * @Description: TODO
 * @Author: 
 * @DateTime: 2021/12/20 17:52
 **/
public class HdfsClient {

    @Test
    public void testMkdirs() throws IOException, URISyntaxException, InterruptedException{

        // 1 Get the file system object
        Configuration configuration = new Configuration();
        // master == 192.168.xx.xx
        // the port of cdh version is 8020, the port of common version Hadoop is 9000.
        FileSystem fs = FileSystem.get(new URI("hdfs://master:8020"), configuration, "root");

        // 2 Create Directory
        fs.mkdirs(new Path("/hadoop/hdfs-api"));

        // 3 Close the resource
        fs.close();
    }
}

2. Error report log

java.net.ConnectException: Call From to master:8020 failed on connection exception: java.net.ConnectException: Connection refused: no further information; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused

	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:831)
	at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:755)
	at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1549)
	at org.apache.hadoop.ipc.Client.call(Client.java:1491)
	at org.apache.hadoop.ipc.Client.call(Client.java:1388)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118)
	at com.sun.proxy.$Proxy12.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:657)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)
	at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)
	at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)
	at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)
	at com.sun.proxy.$Proxy13.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2420)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2396)
	at org.apache.hadoop.hdfs.DistributedFileSystem$27.doCall(DistributedFileSystem.java:1319)
	at org.apache.hadoop.hdfs.DistributedFileSystem$27.doCall(DistributedFileSystem.java:1316)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1333)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1308)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:2275)
	at com.github.td.hdfs.HdfsClient.testMkdirs(HdfsClient.java:29)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
	at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
	at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:68)
	at com.intellij.rt.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:33)
	at com.intellij.rt.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:230)
	at com.intellij.rt.junit.JUnitStarter.main(JUnitStarter.java:58)
Caused by: java.net.ConnectException: Connection refused: no further information
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:715)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:700)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:804)
	at org.apache.hadoop.ipc.Client$Connection.access$3800(Client.java:421)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1606)
	at org.apache.hadoop.ipc.Client.call(Client.java:1435)
	... 46 more

3. Cause analysis

Port problem.

In a CDH environment, the HDFS NameNode listens on port 8020: conf.set("fs.defaultFS", "hdfs://192.168.0.4:8020");

In a vanilla Apache Hadoop environment, it listens on port 9000: conf.set("fs.defaultFS", "hdfs://192.168.0.121:9000");

4. Solution

Change the port number in the URI to 9000.
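
For reference, a minimal sketch of the corrected call from the test above, assuming a vanilla Apache Hadoop cluster whose NameNode listens on the default port 9000:

// Vanilla (non-CDH) Hadoop: the NameNode RPC port defaults to 9000 rather than 8020
FileSystem fs = FileSystem.get(new URI("hdfs://master:9000"), configuration, "root");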

[Solved] PBI Open Error: WebView2 Process failed: explorationBrowser, Source:https://ms-pbi.pbi.microsoft.com/pbi/Web/Views/reportView.hrm

Problem description: An error is reported when opening Power BI Desktop:
WebView2 Process failed: explorationBrowser, Source:https://ms-pbi.pbi.microsoft.com/pbi/Web/Views/reportView.hrm
This is related to the browser Edge. Open Edge and find an error: An incompatible piece of software attempted to load along with Microsoft Edge. This can be caused by malware, though it’s usually caused by a program that is out-of-date. We recommend making sure you have the latest version of that program installed, and that your antimalware software is running and up-to-date.
Resolving the Edge issue also fixes the Power BI error.

Ref: WebView2 uses Microsoft Edge as a rendering engine to display web-based features in a desktop application.

[Solved] Solr Error: org.apache.solr.common.SolrException: undefined field text

Modify the solrconfig.xml file:

vim /solr/configsets/Your_collection_name/conf/solrconfig.xml

Find this paragraph:

<listener event="firstSearcher" class="solr.QuerySenderListener">
  <arr name="queries">
    <lst>
      <str name="q">static firstSearcher warming in solrconfig.xml</str>
    </lst>
  </arr>
</listener>

Change the value of <str name="q"> to *:*, as follows:

<listener event="firstSearcher" class="solr.QuerySenderListener">
  <arr name="queries">
    <lst>
      <str name="q">*:*</str>
    </lst>
  </arr>
</listener>

If ZooKeeper is used, you need to upload the configuration to ZooKeeper again:

/solr/server/scripts/cloud-scripts/zkcli.sh -cmd upconfig -zkhost <address1>:2181,<address2>:2181,<address3>:2181 -confdir /solr/configsets/<your_collection_name>/conf -confname <your_config_name>

Then re-create the collection.
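
As an optional check, here is a minimal SolrJ sketch (assuming a SolrJ 7+/8 client) that runs the same *:* match-all query against the re-created collection; the host, port, and collection name are placeholders for your environment:

import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.client.solrj.response.QueryResponse;

public class SolrCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder Solr URL and collection name -- adjust to your environment
        String url = "http://localhost:8983/solr/your_collection_name";
        try (SolrClient client = new HttpSolrClient.Builder(url).build()) {
            // Same match-all query used in the firstSearcher warming listener above
            QueryResponse response = client.query(new SolrQuery("*:*"));
            System.out.println("numFound = " + response.getResults().getNumFound());
        }
    }
}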

[Solved] undefined reference to `cv::imread(std::string const&, int)’

Problem

Error message: undefined reference to `cv::imread(std::string const&, int)'

When this error appeared, I assumed the OpenCV linking was the problem, so I spent a long time trying different ways of importing OpenCV and linking the library, but the error kept coming back.

It finally turned out that the problem was the C++ ABI.

Solution

The line #define _GLIBCXX_USE_CXX11_ABI 0 in the code causes this error. Removing it lets the program build and run normally.

Reason

#define _GLIBCXX_USE_CXX11_ABI 0 tells the compiler and linker to use the old (pre-C++11) library ABI.

libstdc++ (GCC's C++ standard library) shipped with GCC 5.1 added a new implementation of std::basic_string. The new implementation coexists with the old one under a different name: the new one is std::__cxx11::basic_string and the old one is std::basic_string.

Newer versions of GCC compile std::string as the C++11 type std::__cxx11::basic_string<char>. If a third-party library you call was built without the C++11 ABI, its std::string is actually the old std::basic_string<char>, and the two cannot be converted into each other. #define _GLIBCXX_USE_CXX11_ABI 0 makes the std::string in your own code compile against the old implementation.

My OpenCV build, however, appears to use the new std::__cxx11::basic_string, so the macro must not be set; setting it produces exactly the error shown in the title.

Conversely, if a third-party library uses the old implementation while your compiler is a newer version, you do need to add #define _GLIBCXX_USE_CXX11_ABI 0; otherwise you get undefined reference to `cv::imread(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, int)' instead.

 

[Solved] Cannot initialize Cluster. Please check your configuration for mapreduce.framework.name and the correspond server addresses

Problem Description:

An error is reported when using MapReduce to implement data deduplication:

Cannot initialize Cluster. Please check your configuration for mapreduce.framework.name and the correspond server addresses

Related errors include java.lang.NoClassDefFoundError and java.io.IOException: Cannot initialize Cluster.

Solution:

These problems are caused by incomplete MapReduce and Hadoop dependencies. Add the following to pom.xml:

    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-common</artifactId>
      <version>3.2.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-hdfs</artifactId>
      <version>3.2.0</version>
    </dependency>
    <dependency>
      <groupId>log4j</groupId>
      <artifactId>log4j</artifactId>
      <version>1.2.17</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-client</artifactId>
      <version>3.2.0</version>
    </dependency>

    <!--mapreduce-->
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-mapreduce-client-core</artifactId>
      <version>3.2.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-mapreduce-client-common</artifactId>
      <version>3.2.0</version>
    </dependency>

With these dependencies in place, the problem is solved.
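
For context, here is a minimal sketch of the kind of deduplication driver that hits this error when the client jars above are missing. The class name and input/output arguments are illustrative assumptions, not the original code; the "Cannot initialize Cluster" exception is thrown at job submission when no MapReduce runner can be found on the classpath.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

import java.io.IOException;

public class DedupDriver {

    // Emit every input line as a key; identical lines collapse onto the same key.
    public static class DedupMapper extends Mapper<Object, Text, Text, NullWritable> {
        @Override
        protected void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            context.write(value, NullWritable.get());
        }
    }

    // Write one line per distinct key, i.e. the deduplicated data.
    public static class DedupReducer extends Reducer<Text, NullWritable, Text, NullWritable> {
        @Override
        protected void reduce(Text key, Iterable<NullWritable> values, Context context)
                throws IOException, InterruptedException {
            context.write(key, NullWritable.get());
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "dedup");
        job.setJarByClass(DedupDriver.class);
        job.setMapperClass(DedupMapper.class);
        job.setReducerClass(DedupReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(NullWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        // Without the hadoop-mapreduce-client dependencies on the classpath,
        // this submission step fails with "Cannot initialize Cluster ...".
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}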

[Solved] Failed to configure a DataSource: ‘url’ attribute is not specified and no embedded datasource could be configured.

In a Spring Boot project I carelessly deleted a line of configuration, and it took two to three hours to find the problem, so I am recording it here.

        <!-- The following resources section was added to control which files get packaged -->
        <resources>
            <resource>
                <directory>src/main/resources</directory>
                <!-- Exclude the per-environment configuration files from the resource root,
                     so the packaged artifact does not contain redundant config files -->
                <filtering>true</filtering>
                <excludes>
                    <exclude>application*.yml</exclude>
                </excludes>
            </resource>
            <resource>
                <directory>src/main/resources</directory>
                <filtering>true</filtering>
                <includes>
                    <include>application.yml</include>
                    <include>application-${profiles.active}.yml</include>
                </includes>
            </resource>
        </resources>

It was this <include>application.yml</include> entry that I had deleted; restoring it fixed the error.