Aginity Workbench for Hadoop 4.6 Release Notes

These release notes summarize the changes in Aginity Workbench for Hadoop version 4.6, along with the system requirements and known issues for the release.

Hardware and Software Support

Aginity Workbench for Hadoop has been tested on Hive 1.1 (CDH 5.4), Hive 0.14 (HDP 2.2), Hive 0.12, and Big SQL (BigInsights 3).

System Requirements

  • Windows XP or later
  • Windows Installer 3.1
  • .NET Framework 3.5 SP1
  • IBM Data Server Driver Package (required for Big SQL connections)
  • Memory: 256 MB of RAM
  • Disk space: 100 MB, plus any space needed for .NET Framework and Windows Installer

Full List of Changes Since Aginity Workbench 4.5

Bug Fixes in Build 4.6.0.2339

  • WPD-574 (Major) — All Hive queries timed out after 120 seconds because the Query timeout option in Workbench was not honored. Existed since: 4.5. Resolution: Fixed; Workbench now honors the Query timeout option.
  • WPD-563 (Major) — Import into Workbench for Hive failed when a user chose the INTEGER data type, because Workbench offered INTEGER rather than the Hive-supported INT. Existed since: 4.5. Resolution: Fixed; INTEGER was replaced with INT.
  • WPD-548 (Moderate) — When connected to the Hadoop cluster through Hive, opening Job Tracker Configuration or Task Tracker Configuration from the Server node in the Object Browser returned an error. Existed since: 4.5. Resolution: Fixed.
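To illustrate the WPD-563 fix: the Hive versions supported by this release accept INT, not INTEGER, as the 32-bit integer type in DDL. A minimal sketch of a valid import target (the table and column names are hypothetical):

```sql
-- Hypothetical import target table.
-- Declaring "id INTEGER" fails on the supported Hive versions;
-- the Hive type name is INT.
CREATE TABLE sales_import (
  id      INT,     -- 32-bit integer (Hive's INT, not INTEGER)
  amount  DOUBLE,
  sold_on STRING
);
```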

Known Issues in Build 4.6.0.2339

  • WPD-618 (Major) — Import into Workbench for Hive fails if a user chooses the BIGDECIMAL or NUMERIC data type for the import. These data types are offered in Workbench but are not supported by Hive. Workaround: The BIGDECIMAL and NUMERIC data types will be removed in a future version of Workbench; for now, replace them with other appropriate data types for your import.
  • WPD-348 (Moderate) — When connected to the Hadoop cluster through Big SQL, an INSERT statement into an existing table returns the message "0 rows affected," although the rows are in fact inserted. Workaround: None.
  • WPD-550 (Minor) — When connected to the Hadoop cluster through Big SQL, a user imports a .csv file into an existing table with the "Existing Table Action" set to "Do Not Import." When the import correctly fails, clicking "Show More Details" in Job Monitor returns an error message instead of details on the failed rows. Workaround: None.
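For the WPD-618 workaround: Hive's fixed-point type is DECIMAL(precision, scale), which is the natural substitute for the unsupported BIGDECIMAL and NUMERIC choices. A hedged sketch of a suitable import target (table and column names are hypothetical):

```sql
-- Hypothetical import target for WPD-618: use Hive's DECIMAL
-- in place of the unsupported BIGDECIMAL / NUMERIC data types.
CREATE TABLE invoice_import (
  invoice_id INT,
  total      DECIMAL(18,2)  -- fixed-point; replaces BIGDECIMAL/NUMERIC
);
```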

