
[Solved] gpg: keyserver receive failed: Invalid argument

Solution

Go to this website: http://keyserver.ubuntu.com/

In the error message, find the key ID (or the e-mail address) and search for it on that website.

From the search results, click the first key.

Click into it to see the key's contents.

Select all, copy it, and paste it into a new text file on the Linux machine (any file name will do):

touch gpgKey
vim gpgKey
sudo apt-key add gpgKey
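As a sketch, the key can also be fetched over HTTP instead of pasting it by hand. The key ID below is the one from the error later in this post; substitute your own:

```shell
KEYID="8C718D3B5072E1F5"
URL="http://keyserver.ubuntu.com/pks/lookup?op=get&search=0x${KEYID}"
echo "$URL"                      # the same page you would open in a browser
# curl -s "$URL" -o gpgKey       # download the ASCII-armored key
# sudo apt-key add gpgKey        # register it with apt
```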

After that, apt update works again.

Error reporting process

Before installing MySQL, apt update reported an error:

Err:2 http://repo.mysql.com/apt/ubuntu bionic InRelease                                     
  The following signatures were invalid: EXPKEYSIG 8C718D3B5072E1F5 MySQL Release Engineering <[email protected]>

A method commonly suggested online:

apt-key adv --keyserver hkp://keyserver.ubuntu.com --recv yourKey

For me this did not work either; it still reported an error:

root@xx:/var/log/mysql# apt-key adv  --keyserver hkp://keyserver.ubuntu.com --recv 8C718D3B5072E1F5
Executing: /tmp/apt-key-gpghome.WnlsI9s8pX/gpg.1.sh --keyserver hkp://keyserver.ubuntu.com --recv 8C718D3B5072E1F5
gpg: keyserver receive failed: Invalid argument

[Solved] Failed to replace a bad datanode on the existing pipeline due to no more good datanodes being available to try.

Error message: failed to replace a bad datanode on the existing pipeline due to no more good datanodes being available to try (Nodes: current=[DatanodeInfoWithStorage[192.168.13.130:50010,DS-d105d41c-49cc-48b9-8beb-28058c2a03f7,DISK]], original=[DatanodeInfoWithStorage[192.168.13.130:50010,DS-d105d41c-49cc-48b9-8beb-28058c2a03f7,DISK]]). The current failed datanode replacement policy is DEFAULT, and a client may configure this via 'dfs.client.block.write.replace-datanode-on-failure.policy' in its configuration

This error occurred because I was appending a local file to a .txt file on HDFS: the first append succeeded, but the second one produced the error message above.

From the error message we can pick out the relevant property: dfs.client.block.write.replace-datanode-on-failure.policy.

So I checked Hadoop's etc/hadoop/hdfs-site.xml and found that this policy was not defined. I added the following property, and after a restart the append worked:

<property>
    <name>dfs.client.block.write.replace-datanode-on-failure.policy</name>
    <value>NEVER</value>
</property>

Analysis: by default the replication factor is 3. When one DataNode in the write pipeline fails, HDFS tries to keep 3 replicas by replacing it with another available DataNode. If the pipeline already includes all the DataNodes the cluster has, there is no replacement to find, which produces the error Failed to replace a bad datanode on the existing pipeline due to no more good datanodes being available to try.

If the following property is absent, the replication factor defaults to 3. According to the official Apache documentation, NEVER means "never add a new datanode": after setting it, no replacement DataNode will be sought when one fails. Generally, setting it to NEVER is only advisable on clusters with 3 or fewer DataNodes.

<property>
    <name>dfs.replication</name>
    <value>3</value>
</property>
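Putting it together, a minimal hdfs-site.xml fragment with both properties discussed above might look like this (values are illustrative; tune them for your cluster):

```xml
<configuration>
    <!-- Replication factor: defaults to 3 when omitted -->
    <property>
        <name>dfs.replication</name>
        <value>3</value>
    </property>
    <!-- On small clusters, do not try to replace a failed pipeline DataNode -->
    <property>
        <name>dfs.client.block.write.replace-datanode-on-failure.policy</name>
        <value>NEVER</value>
    </property>
</configuration>
```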

 

[Solved] TypeError: Cannot read properties of undefined (reading ‘templateName’)

I encapsulated a component that initially never reported an error; after adding data, it began to report:

TypeError: Cannot read properties of undefined (reading 'templateName')

The error appeared whenever I added a new item, but not when I modified an existing one. At first I suspected el-tab-pane and spent a long time debugging the dynamic component rendering, but the problem was not there.

The real problem: the child component reads data passed down from the parent, but in the "add" case the parent never passes that data, so the prop is undefined and reading templateName on it throws.

The fix: when adding, have the parent pass an explicit empty value, or have the child check whether it is in add or edit mode before reading the prop.
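A minimal sketch of the guard (names like initForm and rowData are hypothetical, not from the original component):

```javascript
// Child-component helper: tolerate a missing prop in "add" mode.
// rowData is undefined when adding, an object when editing.
function initForm(rowData) {
  const data = rowData || {};                       // parent may pass nothing when adding
  return { templateName: data.templateName || "" }; // safe default instead of a crash
}

console.log(initForm(undefined));                   // add mode: empty defaults
console.log(initForm({ templateName: "invoice" })); // edit mode: real data
```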

[Solved] IDEA Integrate Mybatis.xml File Error: BindingException: Invalid bound statement (not found)

Screenshot of the error:

I checked the corresponding namespace.

I checked the corresponding target directory.

Next, I checked the path of the configuration file.

When none of these four conventional checks solves the problem, consider whether a jar package conflict in the POM file is the cause.

Sure enough: I was using MyBatis-Plus but had imported the plain MyBatis starter jar. Replacing it with mybatis-plus-boot-starter and commenting out all the related mybatis dependencies solved the problem.
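A sketch of the swap in pom.xml (version numbers here are illustrative, not from the original project):

```xml
<!-- Remove (or comment out) the plain MyBatis starter... -->
<!--
<dependency>
    <groupId>org.mybatis.spring.boot</groupId>
    <artifactId>mybatis-spring-boot-starter</artifactId>
    <version>2.2.2</version>
</dependency>
-->
<!-- ...and use the MyBatis-Plus starter instead -->
<dependency>
    <groupId>com.baomidou</groupId>
    <artifactId>mybatis-plus-boot-starter</artifactId>
    <version>3.5.1</version>
</dependency>
```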

[Solved] HTTP Response Received with TEncoding.UTF8 Error "no mapping for the Unicode…"

Today I saw this error, and its solution, shared in a group by wumoonlight. I'm recording it here in case I run into it later. Thanks!

The problem: some of the data in the HTTP response is UTF-8 encoded and some is not. When the response is decoded with TEncoding.UTF8, the error "no mapping for the Unicode…" is raised.

The solution: replace TEncoding.UTF8 with TUTF8EncodeEx when receiving the return value.

How to Solve: Axure Reports an Error When First Opened After Installing on a Mac

Axure RP 9 for Mac, Chinese version (an interactive product-prototyping tool)

An error is reported the first time Axure is opened after installation on a Mac, as shown in the figure.

I tried uninstalling and reinstalling, and Axure 9 cannot simply be swapped for Axure 8. So what to do? Here is the solution.

  1. In the Finder menu bar, choose Go > Home.

  2. In the home directory, press Shift+Command+. to show hidden files, then find the .config folder, right-click it, and choose Get Info.

  3. Under Sharing & Permissions at the bottom, add your currently logged-in account.

To verify the addition: double-click .config; if it opens, the account was added successfully. Make sure Read & Write is selected as the permission on the right.

  4. Inside you will find a configstore folder that also cannot be opened by double-clicking; change its permissions the same way as before.

  5. Once the configstore folder can be opened, launch Axure.

OK, it’s done
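If you prefer the Terminal, a rough equivalent of the GUI permission fix might be the following (assuming, as in the steps above, that the blocked folder is ~/.config; a scratch directory stands in for it here so the commands are safe to try anywhere):

```shell
# On the Mac you would set TARGET="$HOME/.config".
TARGET="$(mktemp -d)"            # scratch directory standing in for ~/.config
chmod -R u+rwX "$TARGET"         # give the current user read/write/traverse
[ -w "$TARGET" ] && echo "writable"
```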

[Solved] mybatis Multi-Module Error: Invalid bound statement (not found)

2022-01-07 14:43:03.030 ERROR 18120 --- [schedule-pool-1] com.inkyi.system.service.SysLogService   : Invalid bound statement (not found): com.inkyi.system.mapper.SysOperLogMapper.insertSelective

org.apache.ibatis.binding.BindingException: Invalid bound statement (not found): com.inkyi.system.mapper.SysOperLogMapper.insertSelective
	at org.apache.ibatis.binding.MapperMethod$SqlCommand.<init>(MapperMethod.java:235) ~[mybatis-3.5.9.jar:3.5.9]
	at org.apache.ibatis.binding.MapperMethod.<init>(MapperMethod.java:53) ~[mybatis-3.5.9.jar:3.5.9]
	at org.apache.ibatis.binding.MapperProxy.lambda$cachedInvoker$0(MapperProxy.java:108) ~[mybatis-3.5.9.jar:3.5.9]
	at java.base/java.util.concurrent.ConcurrentHashMap.computeIfAbsent(ConcurrentHashMap.java:1705) ~[na:na]
	at org.apache.ibatis.util.MapUtil.computeIfAbsent(MapUtil.java:35) ~[mybatis-3.5.9.jar:3.5.9]
	at org.apache.ibatis.binding.MapperProxy.cachedInvoker(MapperProxy.java:95) ~[mybatis-3.5.9.jar:3.5.9]
	at org.apache.ibatis.binding.MapperProxy.invoke(MapperProxy.java:86) ~[mybatis-3.5.9.jar:3.5.9]
	at com.sun.proxy.$Proxy81.insertSelective(Unknown Source) ~[na:na]
	at com.inkyi.system.service.SysLogService.insertOperlog(SysLogService.java:21) ~[main/:na]
	at com.inkyi.framework.manager.AsyncFactory$1.run(AsyncFactory.java:88) ~[main/:na]
	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) ~[na:na]
	at java.base/java.util.concurrent.FutureTask.run$$$capture(FutureTask.java:264) ~[na:na]
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java) ~[na:na]
	at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304) ~[na:na]
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[na:na]
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[na:na]
	at java.base/java.lang.Thread.run(Thread.java:834) ~[na:na]

The project modules are as follows:

The Mapper.xml files and mapper interfaces were definitely correct.

So it had to be a configuration problem: check how the mapper XML files are scanned in the configuration file.

#mybatis
mybatis.type-aliases-package=com.inkyi.*.entity
mybatis.mapper-locations=classpath:mapper/*/*.xml

Change

mybatis.mapper-locations=classpath:mapper/*/*.xml

to

mybatis.mapper-locations=classpath*:mapper/*/*.xml

classpath: scans only the class directory of the current project/module.

classpath*: also scans class directories inside jar packages on the classpath, which is what a multi-module build needs.

[Solved] Error processing tar file(exit status 1): open /src/wwwroot/emsadmin/styles.js.map: no space left on device

An exception occurred while building the project's Docker image.

Exception:

Reason:

The partition Docker writes to is full.

Use df -h to view the disk usage:

df -h /var

Use vgdisplay to check how much the volume group can be extended.

Here, there was no free space left to extend.

Solution:

Expand the disk.
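If the disk cannot be expanded right away, checking and pruning Docker's storage is a common stopgap. The commands below are standard but assume a default Docker install, and docker system prune is destructive:

```shell
# Where Docker keeps images by default; fall back to /var if absent.
DOCKER_DIR="/var/lib/docker"
[ -d "$DOCKER_DIR" ] || DOCKER_DIR="/var"
df -h "$DOCKER_DIR"              # free space on the partition Docker uses
# docker system df               # what Docker itself is consuming
# docker system prune -a         # remove unused images/containers (destructive!)
```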

[Solved] error C1090: PDB API call failed, error code ‘0’

 

Error C1090:

PDB API call failed with error code "error number": message

Error while processing PDB files.

Error C1090 is a catch-all for uncommon compiler PDB-file errors that are not reported separately. Only general recommendations can be given to resolve it:

Clean the build output directory, then do a full rebuild of the solution.

Restart the computer, or use Task Manager to check for hung or orphaned mspdbsrv.exe processes.

Disable antivirus scanning of the project directory.

If you use /MP with MSBuild or another parallel build process, add the /Zf compiler option.

Try building with the 64-bit hosted toolset.
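For the /MP plus /Zf recommendation, one way to apply it is in the project file. This is an illustrative .vcxproj fragment, not taken from any specific project:

```xml
<ItemDefinitionGroup>
  <ClCompile>
    <!-- Build with multiple processes (/MP) -->
    <MultiProcessorCompilation>true</MultiProcessorCompilation>
    <!-- Pass /Zf (faster PDB generation) through as an extra option -->
    <AdditionalOptions>/Zf %(AdditionalOptions)</AdditionalOptions>
  </ClCompile>
</ItemDefinitionGroup>
```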