These release notes summarize the changes in Aginity Workbench for Hadoop version 4.9. The release notes also contain system requirements and list the known issues with the release.
What's New? (Also See Full List of Changes)
Aginity Workbench version 4.9 contains a number of bug fixes.
Technical Details and Changes
This section contains details of specific functions and changes that are included in Aginity Workbench 4.9.
Upgrading from Prior Versions
Workbench automatically checks for new updates and displays a dialog box if an update is available. In the dialog box, select Install this update now to download and install the update.
You can also choose to postpone installation of the update until the next time you run Workbench or suppress the update alert.
To check for updates manually, click Check for Updates on the Help menu, and then follow the instructions in the wizard to install an available update.
Hardware and Software Support
Aginity Workbench has been tested on Hive 1.1 (CDH 5.4), Hive 0.14, Hive 0.12, and Big SQL (BigInsights 3 and 4).
System Requirements
- Windows XP or later
- Windows Installer 3.1
- .NET Framework 4.6.1
- IBM Data Server Driver Package installed on the client computer (required for Big SQL connections)
- Memory: 256 MB RAM
- Disk space: 100 MB, plus any necessary space for .NET Framework and Windows Installer
Full List of Changes Since Aginity Workbench 4.8
Bug Fixes in Build 4.9.1.2686
Severity | Reference | Description | Existed Since Version | Resolution |
---|---|---|---|---|
Major | 16307 | When a user attempts to import a file into a Hive table using the Data Import Wizard, the wizard displays the first HDFS connection created in Query Analyzer, and that connection cannot be changed. If the preselected HDFS connection and the Hive connection selected by the user point to different clusters, the import results in an empty target table although the Job Monitor reports success. | 2.5.1 | A dialog box allowing the user to select an HDFS connection for import was added as the first step of the wizard. If the selected HDFS connection and the Hive connection are located on different clusters, the Job Monitor reports failure and provides a corresponding warning. |
Moderate | 15391 | The User ID field is not visible on the connection dialog box after a user changes the 100% display setting to a larger setting. | 2.5 | The User ID field is visible on all display settings. |
Moderate | 16277 | When a user executes an SQL statement to load data from a file into a table using the LOAD DATA INPATH command (see the example after this table), a "Null Pointer Exception" error generated by Hive appears in the Output window. However, the data is successfully loaded into the table. | 2.5.3 | The error message is now suppressed. |
Minor | 16045 | The Discussion Forums menu item points to the deprecated Aginity Forums. | 2.4 | The menu item now points to the Aginity Community web forums. |
Minor | 15089 | Importing a file into a nonexistent Hive table fails when metastore connection information is not provided. | 2.5 | Fixed. |
Minor | 16311 | When a user runs a query to export a table to Hive in Query Analyzer, a Java error message appears in the Output window although the export is completed successfully. | 2.5.1 | The error message is suppressed in this situation. |
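
The following sketch illustrates the type of statement affected by issue 16277. The table name, column definitions, and HDFS path are hypothetical and only show the shape of a LOAD DATA INPATH load, not an exact reproduction of the reported case.

```sql
-- Hypothetical staging table and HDFS path, for illustration only.
CREATE TABLE IF NOT EXISTS sales_staging (
  order_id INT,
  amount   DOUBLE
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ',';

-- Moves the file from its HDFS location into the table's warehouse directory.
-- In affected versions, Hive surfaced a spurious "Null Pointer Exception"
-- in the Output window even though the load itself succeeded.
LOAD DATA INPATH '/user/demo/sales.csv' INTO TABLE sales_staging;
```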
Known Issues in Build 4.9.1.2686
Severity | Reference | Description | Workaround |
---|---|---|---|
Moderate | WPD-348 | When a user connected to the Hadoop cluster through Big SQL inserts data into an existing table, the INSERT statement returns a "0 rows affected" message although the rows are in fact inserted. | N/A |
Moderate | WPD-819 | The Generate Insert DML command (available from the Advanced menu) does not work for Hive tables. An error appears indicating that the statement could not be compiled. | N/A |
Trivial | APD-3711 | When the field type is Date, the year 0 (for example, 0000-01-01) causes an error “Year, Month, and Day parameters describe an un-representable DateTime.” | Either cast the date as a VARCHAR or do not use the year 0 (see the example after this table). |
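
For APD-3711, the workaround can be expressed as a query rewrite. This is a minimal sketch; the table and column names are hypothetical, and the cast length depends on your data.

```sql
-- Hypothetical table and column, for illustration only.
-- Selecting the DATE column directly fails in Workbench when a row
-- contains the year 0 (for example, 0000-01-01).
SELECT event_date FROM events;

-- Workaround: cast the value to VARCHAR so Workbench receives text
-- rather than a DateTime value it cannot represent.
SELECT CAST(event_date AS VARCHAR(10)) AS event_date FROM events;
```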