This UDF package can be used together with our Impala UDFs, which can be found here.
All methods are located in the package hive_udfs.
UDFMD5 (arg:String) -> String (Hex)
UDFSHA1 (arg:String) -> String (Hex)
UDFSHA2 (arg:String) -> String (Hex)
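Internally, these hash functions presumably wrap `java.security.MessageDigest` and return the digest as a hex string. A minimal sketch of that behavior, assuming standard JDK digests (`hexDigest` is a hypothetical helper, not part of the package):

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class HashSketch {
    // Hypothetical helper mirroring what UDFMD5/UDFSHA1/UDFSHA2 likely do:
    // digest the UTF-8 bytes of the input and return lowercase hex.
    public static String hexDigest(String algorithm, String input) {
        try {
            byte[] digest = MessageDigest.getInstance(algorithm)
                    .digest(input.getBytes(StandardCharsets.UTF_8));
            StringBuilder sb = new StringBuilder(digest.length * 2);
            for (byte b : digest) {
                sb.append(String.format("%02x", b));
            }
            return sb.toString();
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) {
        // "abc" is the classic RFC 1321 / FIPS 180 test vector input.
        System.out.println(hexDigest("MD5", "abc"));
        System.out.println(hexDigest("SHA-256", "abc"));
    }
}
```

The hex string returned by the UDFs should match what `hex()` produces for the same digest in Hive, so results are comparable across tools.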
UDFKeyGen128 (arg:id) -> String (Hex)
UDFKeyGen256 (arg:id) -> String (Hex)
Pass an id to ensure different results for each call; otherwise, Hive is not able to reset the salt.
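A minimal sketch of what the key generators likely do, assuming they draw random bytes and hex-encode them (`keyGen` is a hypothetical helper; the id argument only forces Hive to treat each call as distinct rather than feeding entropy):

```java
import java.security.SecureRandom;

public class KeyGenSketch {
    // Hypothetical sketch of UDFKeyGen128/UDFKeyGen256: produce a random key
    // of the requested bit length as a hex string. The id parameter is not
    // used as entropy; it only prevents Hive from caching a single result.
    public static String keyGen(int bits, long id) {
        byte[] key = new byte[bits / 8];
        new SecureRandom().nextBytes(key);
        StringBuilder sb = new StringBuilder(key.length * 2);
        for (byte b : key) {
            sb.append(String.format("%02x", b));
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(keyGen(128, 1)); // 32 hex characters (16 bytes)
        System.out.println(keyGen(256, 2)); // 64 hex characters (32 bytes)
    }
}
```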
For the AES functions, arg is the text to encrypt/decrypt and arg1 is the password or key in hex:
UDFAES128Encrypt (arg:String, arg1:String (Hex) ) -> String (Hex)
UDFAES128Decrypt (arg:String, arg1:String (Hex) ) -> primitive type
UDFAES256Encrypt (arg:String, arg1:String (Hex) ) -> String (Hex)
UDFAES256Decrypt (arg:String, arg1:String (Hex) ) -> primitive type
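The encrypt/decrypt pair can be sketched with the standard `javax.crypto` API and a raw hex key; note that the cipher mode and padding used below (`AES/ECB/PKCS5Padding`) are assumptions for illustration, not necessarily what the UDFs use:

```java
import javax.crypto.Cipher;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.security.GeneralSecurityException;

public class AesSketch {
    // Hypothetical sketch of UDFAES128Encrypt/UDFAES128Decrypt.
    static byte[] hexToBytes(String hex) {
        byte[] out = new byte[hex.length() / 2];
        for (int i = 0; i < out.length; i++) {
            out[i] = (byte) Integer.parseInt(hex.substring(2 * i, 2 * i + 2), 16);
        }
        return out;
    }

    static String bytesToHex(byte[] bytes) {
        StringBuilder sb = new StringBuilder(bytes.length * 2);
        for (byte b : bytes) sb.append(String.format("%02x", b));
        return sb.toString();
    }

    public static String encrypt(String plaintext, String keyHex) {
        try {
            Cipher cipher = Cipher.getInstance("AES/ECB/PKCS5Padding"); // assumed mode
            cipher.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(hexToBytes(keyHex), "AES"));
            return bytesToHex(cipher.doFinal(plaintext.getBytes(StandardCharsets.UTF_8)));
        } catch (GeneralSecurityException e) {
            throw new IllegalStateException(e);
        }
    }

    public static String decrypt(String cipherHex, String keyHex) {
        try {
            Cipher cipher = Cipher.getInstance("AES/ECB/PKCS5Padding"); // assumed mode
            cipher.init(Cipher.DECRYPT_MODE, new SecretKeySpec(hexToBytes(keyHex), "AES"));
            return new String(cipher.doFinal(hexToBytes(cipherHex)), StandardCharsets.UTF_8);
        } catch (GeneralSecurityException e) {
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) {
        String key = "000102030405060708090a0b0c0d0e0f"; // example 128-bit key in hex
        String ct = encrypt("scalefree.com", key);
        System.out.println(ct);
        System.out.println(decrypt(ct, key)); // round-trips to the plaintext
    }
}
```

A 256-bit key works the same way with a 64-character hex string, but on older JDKs it requires the unlimited-strength cryptography policy files to be installed.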
Download the Java JCA JAR and add it to the project as an external JAR. Download the necessary Hadoop/Hive JARs and add them to your project in the same way.
- Create a JAR from the UDFS package.
- Upload the JAR, e.g. via SCP, to the cluster:
scp -r -P2222 <path>/<udf.jar> <user>@<address>:/<path>
- Register the JAR on the classpath via the Hive shell:
ADD JAR <path>/<udf.jar>;
- The registered JAR should be visible via:
LIST JARS;
- Create a function from the registered JAR:
CREATE TEMPORARY FUNCTION <alias> AS '<package>.<method>';
- Afterwards, you can test your functions as usual:
SELECT encaes128("scalefree.com",hex("secret"));