
Comments (8)

RunningJon commented on July 18, 2024

from tpc-ds.

dimon777 commented on July 18, 2024

How many nodes? 6 datanodes, 8 nodes total
How much RAM per node? 8GB
How much swap per node? 16GB
How much RAM is being used by other processes? 1.3GB is used by CDH processes on each node; 6.5GB is free.
Are you using YARN or the default resource manager? For HAWQ, the default RM is used.
Are you using randomly distributed tables? I am not sure, since these are TPC-DS-generated tables.
What are the values for these GUCs?
hawq_rm_stmt_vseg_memory = 128mb
hawq_rm_memory_limit_perseg = 4GB
default_hash_table_bucket_number = 24
hawq_rm_nvseg_perquery_perseg_limit = 6

OS settings:
vm.overcommit_ratio = 50
vm.overcommit = this parameter doesn't exist in CentOS 7.1 (the Linux sysctl is named vm.overcommit_memory)
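As a quick sanity check on the values above: my reading (an assumption, not a documented formula) is that a query's worst-case virtual-segment memory on one segment is roughly hawq_rm_nvseg_perquery_perseg_limit × hawq_rm_stmt_vseg_memory, which should stay under hawq_rm_memory_limit_perseg:

```python
# Sanity-check the quoted GUC values. The combination rule here is my
# assumption about how HAWQ's resource manager bounds per-segment usage;
# the actual accounting may differ.
vseg_memory_mb = 128            # hawq_rm_stmt_vseg_memory (128mb)
nvseg_perquery_perseg = 6       # hawq_rm_nvseg_perquery_perseg_limit
memory_limit_perseg_mb = 4096   # hawq_rm_memory_limit_perseg (4GB)

worst_case_mb = vseg_memory_mb * nvseg_perquery_perseg
print(worst_case_mb)                             # 768
print(worst_case_mb <= memory_limit_perseg_mb)   # True
```

By this arithmetic the memory settings look consistent (768MB ≤ 4GB), which suggests the error text is about the bucket-number mismatch rather than actual memory exhaustion.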

Thank you.


dimon777 commented on July 18, 2024

I see the code says this:

if ((context.resultRelationHashSegNum < context.externTableForceSegNum
		&& context.externTableForceSegNum != 0)
		|| (context.resultRelationHashSegNum < context.externTableLocationSegNum)) {
	elog(ERROR, "Could not allocate enough memory! "
		"bucket number of result hash table and external table should match each other");
}

What should I adjust to avoid this error? I don't know how to translate these context parameters into GUCs (if my case is indeed a misconfiguration).
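For what it's worth, the condition can be paraphrased to see when it fires. The variable names mirror the C snippet; the mapping of each to a GUC (e.g. resultRelationHashSegNum to the result table's default_hash_table_bucket_number) is my interpretation, not confirmed:

```python
# Paraphrase of the C condition above. My reading of the names:
#   result_hash_seg  - bucket number of the hash-distributed result table
#   force_seg        - forced segment count for the external table (0 = unset)
#   location_seg     - segment count derived from external table locations
def bucket_mismatch(result_hash_seg, force_seg, location_seg):
    return (result_hash_seg < force_seg and force_seg != 0) or \
           (result_hash_seg < location_seg)

# A result table with fewer buckets than the external table's segments errors out:
print(bucket_mismatch(result_hash_seg=6, force_seg=0, location_seg=24))   # True -> elog(ERROR)
print(bucket_mismatch(result_hash_seg=24, force_seg=0, location_seg=24))  # False -> OK
```

If that reading is right, the fix is to make the two bucket/segment counts agree rather than to add memory.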


RunningJon commented on July 18, 2024


dimon777 commented on July 18, 2024

Thank you,

I will try these. Unfortunately I have no control over which CDH, Hortonworks, or CentOS version is used (CentOS 7 appears to be supported: https://cwiki.apache.org/confluence/display/HAWQ/Build+and+Install). In my tests I have not seen any issues with this configuration. My goal is to test HAWQ on the available Hadoop platform and compare it with Impala.


RunningJon commented on July 18, 2024


dimon777 commented on July 18, 2024

Yes, I am aware of Cloudera's query adjustments for the Impala TPC-DS test. I also read the Pivotal article on this. And I fully agree: the way Cloudera did this is unacceptable and misleading for those who rely on the TPC-DS benchmark to judge the platform.


dimon777 commented on July 18, 2024

The issue was that .bashrc in my environment didn't have the GREENPLUM_PATH variable set. After adding this:
export GREENPLUM_PATH=/usr/local/hawq/greenplum_path.sh
and changing the RANDOM_DISTRIBUTION flag to true in the ./rollout.sh call, the test started working fine.
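The fix above can be sketched as a shell snippet (the demo writes to a temp file instead of the real ~/.bashrc; the /usr/local/hawq path comes from the comment and will vary by install):

```shell
# Demonstrate the .bashrc change described above, against a temp file
# so it can run safely anywhere.
BASHRC="$(mktemp)"   # stand-in for ~/.bashrc in this demo

# The line the comment adds; rollout.sh presumably sources this script
# via GREENPLUM_PATH to pick up the HAWQ environment (my assumption).
echo 'export GREENPLUM_PATH=/usr/local/hawq/greenplum_path.sh' >> "$BASHRC"

# Verify the variable is now set in the file.
grep -q 'GREENPLUM_PATH' "$BASHRC" && echo "GREENPLUM_PATH set"

rm -f "$BASHRC"
```

On a real system the same `export` line goes into ~/.bashrc, followed by a fresh login (or `source ~/.bashrc`) before rerunning ./rollout.sh with RANDOM_DISTRIBUTION=true.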

