Interacting with IDA through IPC channels

I’m happy to present a guest post by David Zimmer <dzzie@yahoo.com>. The approach he describes can be used, among other things, to develop plugins more conveniently:


In this article we are going to discuss a mechanism that can be used to interact with IDA from external applications.

The technique was developed to provide a convenient way for utility applications to query information from IDA databases and to automate the IDA user interface.

Over the years, several methods have been tried, such as pipes and sockets. In the end, the easiest Inter-Process Communication (IPC) technique I have found is the Windows-specific SendMessage API along with the WM_COPYDATA message. This technique was chosen for its simplicity, reliability, and its inherently synchronous nature.

With this technique, the IDA server plugin creates a window and watches for command messages to come in.


(abbreviated C source to CreateWindow and receive messages)

The plugin contains its own text-based API which can be used to automate actions and return queried data back to the calling process. Example commands and data transfer routines are shown below:


(hwnd arguments specify which window handle receives data callback)
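
To give an idea of the client side of this channel, here is a minimal Python sketch of pushing a text command to the server via ctypes. It is not taken from the IDASrvr sources: the window caption and the command verb are hypothetical placeholders for whatever the plugin actually registers and parses.

import ctypes
import ctypes.wintypes as wt

class COPYDATASTRUCT(ctypes.Structure):
    _fields_ = [("dwData", ctypes.c_void_p),  # user-defined value
                ("cbData", wt.DWORD),         # size of the payload, in bytes
                ("lpData", ctypes.c_char_p)]  # pointer to the payload

user32 = ctypes.windll.user32
WM_COPYDATA = 0x004A

def send_command(cmd, caption="IDA_SERVER", reply_hwnd=0):
    # Find the window created by the server plugin (the caption is an
    # assumption); pass your own hwnd as wParam if you expect data back.
    hwnd = user32.FindWindowA(None, caption)
    if not hwnd:
        raise RuntimeError("IDA server window not found")
    payload = cmd + "\x00"
    cds = COPYDATASTRUCT(None, len(payload), payload)
    # SendMessage does not return until the plugin has handled the command,
    # which is what makes this channel naturally synchronous.
    return user32.SendMessageA(hwnd, WM_COPYDATA, reply_hwnd, ctypes.byref(cds))

send_command("jumpto:0x401000")  # hypothetical command verb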

There are several advantages to this technique. Traditional plugin development usually requires compiling your code and launching the plugin from the host application.
Debugging often entails wading through runtime debug logs, inferring problems through observed behavior, and chasing crashes in a debugger.

This technique allows you to debug your executable in the development IDE as normal, harnessing all of its powerful capabilities such as edit and continue, call stack viewing, variable watches, breakpoints etc.

Complex projects can be run directly while you iron out interface behaviors and are not dependent upon the host. Datasets can even be loaded from test files so that many routines can be debugged independently of IDA.

One project where this technique was put to good use was an experiment to see if a Javascript IDE could be created for IDA automation tasks. The desire was for a scriptable interface that supported IntelliSense, function prototype tooltips, and syntax highlighting.


(screenshot: the Javascript IDE proof of concept)

With a project such as this, coding the IDE behaviors is a major part of the development task. To develop such an application strictly as a plugin would be a daunting endeavor. Developing it as a standalone application was a great advantage: it could be run directly from the Visual Studio IDE without any intermediate steps.

One other advantage of this technique is that it opens up plugin development to higher-level languages that do not have native support for the IDA API*. Languages such as C#, Delphi, and VB6 can easily interact with IDA through this mechanism. These languages have excellent rapid GUI development capabilities and a wealth of complex components already available to them. This technique is even open to Java developers through the JNI.

Once the intermediate API and the client access library are written, creating utilities to integrate with IDA becomes a pretty quick process. Another example project that has come in handy is a Wingraph32 replacement that was coded in about a day.

The interface shown below automatically syncs the IDA disassembly when graph nodes are clicked and can perform several renaming operations.


(screenshot: the graph viewer interface)

There are some limitations to this technique as well. The particular implementation detailed in this article is Windows-specific. It also requires both applications to be running on the same machine.

Another consideration is that the data crosses process boundaries, which can be a performance hit depending on what is being transferred and how often the channel is invoked. For example, extracting or patching large blocks of data would best be optimized by reserving the SendMessage API for the command channel and using files or shared memory for the data transfer. This will likely be an optimization included in future revisions.
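
As a rough illustration of that hybrid approach (an assumption about one possible layout, not the current IDASrvr protocol), the command channel could carry only a request plus a scratch-file path, while the bulk data travels through the file:

import os
import tempfile

def read_bytes(ea, size):
    # send_command() is the hypothetical helper sketched earlier; the
    # "readbytes" verb and its argument format are invented for illustration.
    fd, path = tempfile.mkstemp(suffix=".bin")
    os.close(fd)
    try:
        send_command("readbytes:%x:%x:%s" % (ea, size, path))
        with open(path, "rb") as f:
            return f.read()
    finally:
        os.remove(path)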

The example IDASRVR project linked below currently supports 36 APIs and uses a simple text-based command protocol. Sample code is provided in C and C#. Client libraries are also available for C# and VB6 in the associated projects below.

Sample Projects:

IDASrvr – IDA IPC Server example

IDA_Jscript – Javascript IDE PoC for IDA (VB6)

GleeGraph – C# Wingraph32 replacement

* You can create C# and VB6 in-process plugins for IDA through COM. This was my first approach, and was used in my IDACompare plugin.

Loading your own modules from your IDAPython scripts with idaapi.require()

TL;DR

If you were using import to import your own “currently-in-development” modules from your IDAPython scripts, you may want to use idaapi.require(), starting with IDA 6.5.

Rationale

When using IDAPython scripts, users were sometimes facing the following issue.

Specifically:

  • User loads script
  • Script imports user’s module mymodule
  • Script ends
  • User modifies code of mymodule (Note: the module is modified, not the script)
  • User reloads script
  • Modifications to mymodule aren’t taken into consideration.

While that’s perfectly understandable (the Python runtime doesn’t have to reload mymodule if it has already been compiled & loaded), this is somewhat of an annoyance for users who were importing frequently-modified modules.
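
As a minimal illustration of that caching behaviour (mymodule.py here is a hypothetical file sitting next to the script):

# myscript.py, first run:
import mymodule            # mymodule.py is loaded and executed
print mymodule.VERSION     # suppose this prints "1"

# Now edit mymodule.py so that VERSION = "2" and run myscript.py again in
# the same IDA session: the cached entry in sys.modules is reused, no reload
# happens, and the print still shows "1".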

IDA <= 6.4: Ensuring a user-specified module gets reloaded, by destroying it.

Up until IDA 6.4, the IDAPython plugin would do some magic after you had run your user script.

The sequence becomes:

  • User loads script
  • Script imports user’s module mymodule
  • Script ends
  • [module mymodule is deleted]
  • User modifies code of mymodule
  • User reloads script
  • Modifications to mymodule are taken into consideration, since module was deleted.

Unfortunately we have to stop doing this because:

  • That prevents Python-based hooks from being used after the script has finished (see below).
  • That goes against the rest of the python philosophy (i.e., modifications to objects are not reverted), and is therefore unexpected.

Issues with hooks.

Imagine you have the following script, dbghooks.py:

from idaapi import *
import mydbghelpers

class MyHooks(DBG_Hooks):

  def __init__(self):
    ...

  def dbg_bpt(self, tid, ea):
    mydbghelpers.do_something()
    return 0

  def dbg_step_into(self):
    ...

hooks = MyHooks()
hooks.hook()

  • User loads script
  • Script imports mydbghelpers
  • Script creates instance of MyHooks, and hooks it into IDA’s debugger APIs
  • Script ends
  • [module mydbghelpers is deleted]
  • User runs debugger, and a breakpoint is hit. Two things can happen:
    • The hook fails executing
    • IDA crashes (that can happen if the “from mydbghelpers import *” form was used)

IDA > 6.4: Introducing idaapi.require()

Everywhere else in Python, when you modify a runtime object, the changes remain visible.

We decided it would be better to stop going against that standard behaviour, and to provide a helper that achieves the same result as the previous deletion of user modules.

You can now import & re-import a module with: idaapi.require(name)

Here is its definition:

def require(modulename):
    if modulename in sys.modules.keys():
        reload(sys.modules[modulename])
    else:
        import importlib
        import inspect
        m = importlib.import_module(modulename)
        frame_obj, filename, line_number, function_name, lines, index = inspect.stack()[1]
        importer_module = inspect.getmodule(frame_obj)
        if importer_module is None: # No importer module; called from command line
            importer_module = sys.modules['__main__']
        setattr(importer_module, modulename, m)
        sys.modules[modulename] = m

Example

The example debugger hooks script above becomes:

from idaapi import *
idaapi.require("mydbghelpers")

class MyHooks(DBG_Hooks):

  def __init__(self):
    ...

  def dbg_bpt(self, tid, ea):
    mydbghelpers.do_something()
    return 0

  def dbg_step_into(self):
    ...

hooks = MyHooks()
hooks.hook()

I.e., only the second line changes.

Installing PIP packages, and using them from IDA on a 64-bit machine

Recently, one of our customers came to us asking how he should proceed to install Python packages using PIP, and then use those from IDA.

The issue he was facing is that his system is a 64-bit Ubuntu 12.04 VM.
Using the Ubuntu-bundled PIP will therefore just install the desired package (let’s say ssdeep) for the system Python runtime, which is a 64-bit runtime and thus not compatible with IDA.

The best (as in: cleanest) solution I have found is to:

  • build a 32-bit Python on the system.
  • pip-install packages into that 32-bit Python’s sub-directories.
  • export PYTHONPATH to point to the 32-bit Python’s sub-directories.

We figured we’d write it down here just in case it might help others.

Prerequisites

  • Install autoconf
  • Install ia32-libs

Building & installing a 32-bit Python

  • ..$ export LD_LIBRARY_PATH=/lib/i386-linux-gnu/:/usr/lib32:$LD_LIBRARY_PATH
  • Download Python 2.7.4
    • Note: You should make sure that the MD5 checksum and the size of the file you downloaded match those that are advertised on the page. That would prevent a man-in-the-middle attacker from providing you with a malicious Python bundle.
  • Build it. Note that you’ll probably have to sudo-create a few symlinks. I had to do this, on the Ubuntu 12.04 64-bit VM I tested this on:
    • /lib/i386-linux-gnu/libssl.so -> /lib/i386-linux-gnu/libssl.so.1.0.0
    • /lib/i386-linux-gnu/libcrypto.so -> /lib/i386-linux-gnu/libcrypto.so.1.0.0
    • /lib/i386-linux-gnu/libz.so -> /lib/i386-linux-gnu/libz.so.1
  • For the sake of completeness, here are my build commands (don’t forget the flags, of course):
    • ..$ CFLAGS=-m32 LDFLAGS=-m32 ./configure --prefix=/opt/Python2.7.4-32bits
    • ..$ CFLAGS=-m32 LDFLAGS=-m32 make -j 8

Once the build completes

Here’s what I have as the last lines of the build:

INFO: Can't locate Tcl/Tk libs and/or headers

Python build finished, but the necessary bits to build these modules were not found:
_bsddb             _curses            _curses_panel
_sqlite3           _tkinter           bsddb185
bz2                dbm                gdbm
readline           sunaudiodev
To find the necessary bits, look in setup.py in detect_modules() for the module's name.

If, below that output, you see that it failed to build, say, 'binascii', then something went wrong.

Make sure you run make -j 1 to check what went wrong (i.e., which library it claims it cannot find).

Once you have successfully built your 32-bit Python, it’s time to install it: sudo make install

Trying your freshly-built python

..$ /opt/Python2.7.4-32bits/bin/python2.7
Python 2.7.4 (default, Apr 26 2013, 16:03:38)
[GCC 4.6.3] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import binascii
>>>

No complaint so far. Good.

Checking that pkg_resources is available.

Try importing pkg_resources. If it fails, you’ll probably have to do the following:

..$ cd /tmp
..$ curl -O http://python-distribute.org/distribute_setup.py
..$ less distribute_setup.py  # (*)
..$ sudo /opt/Python2.7.4-32bits/bin/python2.7 distribute_setup.py

That will print out quite a fair amount of info, and should succeed.

(*) Note: A careful reader has pointed out that it would be fairly easy to intercept (man-in-the-middle) such an HTTP request, and serve malicious content that would then be piped (as root) to Python.
That’s why I think it’s important to mention, as a third step (i.e., less ...), that the code that was downloaded should ideally be checked. Hopefully, http://python-distribute.org will soon provide HTTPS support, which will limit such MITM attack risks.

Trying your freshly-built python, again

We want to make sure pkg_resources can be imported.

..$ /opt/Python2.7.4-32bits/bin/python2.7
Python 2.7.4 (default, Apr 26 2013, 16:03:38)
[GCC 4.6.3] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import pkg_resources
>>>

Still no complaint. Good.

If yours complains, you’ll first have to fix whatever is causing it to fail, because the next steps will not work without it.

Installing PIP for your new Python build

Since using your system’s PIP will probably not work (as it would build & install things in a 64-bit Python sub-directory), you’ll have to install PIP itself specifically for your freshly-built Python.

Here’s how I proceeded:

..$ cd /tmp;
..$ curl -O https://raw.github.com/pypa/pip/master/contrib/get-pip.py;
..$ sudo /opt/Python2.7.4-32bits/bin/python2.7 get-pip.py

PIP is now installed.

PIP-installing a package (in this case, ssdeep)

To download/build/install the ssdeep package I ran, as root (either that, or you’ll have to give your user the rights to write in /opt/Python2.7.4-32bits):

..$ su
Password:
root ..$ export CFLAGS=-m32
root ..$ export LDFLAGS=-m32
root ..$ export LD_LIBRARY_PATH=/lib/i386-linux-gnu/:/usr/lib32:$LD_LIBRARY_PATH
root ..$ /opt/Python2.7.4-32bits/bin/python2.7 /opt/Python2.7.4-32bits/bin/pip install ssdeep

Notice how I use my freshly-built Python, with my freshly-installed PIP (and not the system one).

Note: Don’t forget the export lines, or PIP will partially build stuff for x64, and partially for x86. That, as you can guess, won’t quite work.

If you forgot the export lines and started building anyway (and the build failed because of the mixed architecture issue I just wrote about), make sure you delete whatever is in /tmp/pip-build-*, so that there won’t be stale object files of inappropriate architecture in there.

Checking that the PIP-installed package works

..$ /opt/Python2.7.4-32bits/bin/python2.7
Python 2.7.4 (default, Apr 26 2013, 16:03:38)
[GCC 4.6.3] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import ssdeep
>>> ssdeep
<module 'ssdeep' from '/opt/Python2.7.4-32bits/lib/python2.7/site-packages/ssdeep.so'>
>>> dir(ssdeep)
['Error', '__all__', '__builtins__', '__doc__', '__file__', '__name__', '__package__', '__test__', '__version__', 'compare', 'hash', 'hash_from_file', 'sys']
>>>

So far so good.

Testing the PIP-installed package in IDA

Since that’s still the goal (though you might have forgotten by now, given the number of directions above… 😉),
we’ll now try to make use of that PIP-installed package in IDA.

  • ..$ export PYTHONPATH=/opt/Python2.7.4-32bits/lib/python2.7/site-packages:/opt/Python2.7.4-32bits/:$PYTHONPATH
  • ..$ idaq

If all went well, typing import ssdeep in the Python input line should properly, silently, nicely import the package.
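
For a slightly more telling check than a silent import, you can poke at the package from the same input line. The hash/compare functions appear in the dir() output shown earlier; the input strings are arbitrary:

import sys, ssdeep
print sys.maxint                 # 2147483647 on a 32-bit runtime
h1 = ssdeep.hash("the quick brown fox jumps over the lazy dog" * 16)
h2 = ssdeep.hash("the quick brown fox jumps over the lazy cat" * 16)
print ssdeep.compare(h1, h2)     # similarity score between 0 and 100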

Recon 2012: Compiler Internals

This year I was again lucky to present at Recon in Montreal. There were many great talks, as usual. I combined the topics of my last year’s talk on C++ reversing and my OpenRCE article on Visual C++ internals. The new material covered the implementation of exceptions and RTTI in MSVC x64 and GCC (including Apple’s iOS).

The videos are not up yet, but here are the slides of my presentation and a few demo scripts I made for it to parse GCC’s RTTI structures and exception tables. I also added my old scripts from OpenRCE, which I amended slightly for the current IDA versions (mostly changed hotkeys).

Slides
Scripts

Calling IDA APIs from IDAPython with ctypes

IDAPython provides wrappers for a big chunk of the IDA SDK. Still, there are some APIs that are not wrapped, either because of SWIG limitations or just because we didn’t get to them yet. Recently, I needed to test the get_loader_name() API, which is not available in IDAPython, but I didn’t want to write a full plugin just for one call. For such cases it’s often possible to use the ctypes module to call the function manually.

The IDA APIs are provided by the kernel dynamic library. On Windows it’s called ida.wll (or ida64.wll), on Linux libida[64].so, and on OS X libida[64].dylib. ctypes provides a nice feature that dynamically creates a callable wrapper for a DLL export by treating it as an attribute of a special class instance. Here’s how to get that instance on the three platforms supported by IDA:

import ctypes
import sys
idaname = "ida64" if __EA64__ else "ida"
if sys.platform == "win32":
    dll = ctypes.windll[idaname + ".wll"]
elif sys.platform == "linux2":
    dll = ctypes.cdll["lib" + idaname + ".so"]
elif sys.platform == "darwin":
    dll = ctypes.cdll["lib" + idaname + ".dylib"]

We use “windll” because the IDA APIs use the stdcall calling convention on Windows (check the definition of idaapi in pro.h).

Now we just need to call our function as if it were an attribute of the “dll” object. But first we need to prepare the arguments. Here’s the declaration from loader.hpp:

idaman ssize_t ida_export get_loader_name(char *buf, size_t bufsize);

ctypes provides a convenience function for creating character buffers:

buf = ctypes.create_string_buffer(256)

And now we can call the function:

dll.get_loader_name(buf, 256)

To retrieve the contents of the buffer as a Python byte string, just use its .raw attribute. The complete script now looks like this:

import ctypes
import sys
idaname = "ida64" if __EA64__ else "ida"
if sys.platform == "win32":
    dll = ctypes.windll[idaname + ".wll"]
elif sys.platform == "linux2":
    dll = ctypes.cdll["lib" + idaname + ".so"]
elif sys.platform == "darwin":
    dll = ctypes.cdll["lib" + idaname + ".dylib"]
buf = ctypes.create_string_buffer(256)
dll.get_loader_name(buf, 256)
print "loader:", buf.raw

ctypes offers many means to interface with C code, so you can use it to call almost any IDA API.
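
As an optional refinement (not part of the snippet above), you can declare the prototype so that ctypes checks the argument types and treats the return value as a real ssize_t instead of the default C int:

import ctypes
# "dll" is the library object obtained in the script above
get_loader_name = dll.get_loader_name
get_loader_name.restype = ctypes.c_ssize_t   # requires Python 2.7+
get_loader_name.argtypes = [ctypes.c_char_p, ctypes.c_size_t]

buf = ctypes.create_string_buffer(256)
if get_loader_name(buf, len(buf)) > 0:
    print "loader:", buf.raw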

The trace replayer

One of the new features that will be available in the next version of IDA is a trace replayer. This pseudo-debugger allows re-playing execution traces of programs debugged in IDA. It can replay traces recorded with any of the currently supported debuggers, ranging from the local Linux or win32 debuggers to remote GDB targets. Currently supported targets include x86, x86_64, ARM, MIPS and PPC.

When we are re-playing a recorded trace, we can step forward and backward, set breakpoints, inspect register values, change the instruction pointer to any recorded IP, etc…

Also, trace management capabilities have been added to IDA in order to allow saving and loading recorded execution traces.


New features in Hex-Rays Decompiler 1.6

Last week we released IDA 6.2 and Hex-Rays Decompiler 1.6. Many of the new IDA features have been described in previous posts, but there have been notable additions in the decompiler as well. They will let you make the decompilation cleaner and closer to the original source. However, it might not be very obvious how to use some of them, so we will describe them in more detail.

1. Variable mapping

This is probably the simplest new feature and can be used without any extra preparation.

Sometimes the compiler stores the same variable in several places (e.g. a register and a stack slot). While the decompiler often manages to combine such locations, sometimes it’s not able to prove that they always contain the same value (especially in the presence of calls that take the address of stack variables). In such cases the user can help by performing such a merge or mapping manually.

Consider the following very common case:

int __stdcall SciFreeFilterInstance(_FILTER_INSTANCE *pFilterInstance)
{
  _FILTER_INSTANCE *v1; // esi@1

  v1 = pFilterInstance;
  if ( pFilterInstance->Signature != 'FrtS' )
    RtlAssert(
      "(pFilterInstance)->Signature==SIGN_FILTER_INSTANCE",
      "d:\\xpsprtm\\drivers\\wdm\\dvd\\class\\codinit.c",
      0x17A2u,
      0);
  StreamClassDebugPrint(2, "Freeing filterinstance %p still open streams\n", v1);

The compiler copied an incoming argument (pFilterInstance) into a register (v1==esi). To get rid of the extra name, right-click the left-hand variable and choose “Map to another variable”, or place the cursor on it and press ‘=’:

(screenshot: mapvar2)

Choose the right-hand variable from the list.

(screenshot: mapvar3)

Once decompilation is refreshed, both the left-hand variable (v1) and the assignment are gone. Now we have only one variable – the incoming argument.

int __stdcall SciFreeFilterInstance(_FILTER_INSTANCE *pFilterInstance)
{
  if ( pFilterInstance->Signature != 'FrtS' )
    RtlAssert(
      "(pFilterInstance)->Signature==SIGN_FILTER_INSTANCE",
      "d:\\xpsprtm\\drivers\\wdm\\dvd\\class\\codinit.c",
      0x17A2u,
      0);
  StreamClassDebugPrint(2, "Freeing filterinstance %p still open streams\n",
    pFilterInstance);

You can map several variables to the same name, if necessary.

Made a mistake or mapped too much? It’s simple to fix. Right-click the wrongly mapped name and choose “Unmap variables”. Then choose the variable you want to see again.

2. Union selection

This feature, naturally, only applies to unions. That means that you need to have union types in your database and assign the types to some variables or fields.

Normally the decompiler tries to choose a union field which matches the expression best, but sometimes there are several equally valid matches, and sometimes other types in the expression are wrong. In such cases, you can override the decompiler’s decision. For example, this code is common in Windows drivers:

NTSTATUS __stdcall DispatchDeviceControl(PDEVICE_OBJECT DeviceObject, PIRP Irp)
{
  PIO_STACK_LOCATION stacklocation; // ebx@1

  stacklocation = Irp->Tail.Overlay.CurrentStackLocation;
  if ( *&stacklocation->Parameters.Create.FileAttributes == 0x224010 )
  {
    v8 = stacklocation->Parameters.Create.Options == 20;
    if ( !v8 )
      goto LABEL_18;
    if ( stacklocation->Parameters.Create.SecurityContext < 1 )
      goto LABEL_87;
    v23 = Irp->AssociatedIrp.MasterIrp;

Since we know we’re in a DeviceControl handler, it’s likely the code is inspecting the Parameters.DeviceIoControl substructure and not Parameters.Create.

Right-click the field and choose “Select union field”, or place the cursor on it and press Alt-Y.

(screenshot: selunion2)

Choose the Parameters.DeviceIoControl.IoControlCode field.

(screenshot: selunion3)

Other references to Parameters.Create can be fixed the same way. The updated decompilation makes more sense:

NTSTATUS __stdcall DispatchDeviceControl(PDEVICE_OBJECT DeviceObject, PIRP Irp)
{
  PIO_STACK_LOCATION stacklocation; // ebx@1

  stacklocation = Irp->Tail.Overlay.CurrentStackLocation;
  if ( stacklocation->Parameters.DeviceIoControl.IoControlCode == 0x224010 )
  {
    v8 = stacklocation->Parameters.DeviceIoControl.InputBufferLength == 20;
    if ( !v8 )
      goto LABEL_18;
    if ( stacklocation->Parameters.DeviceIoControl.OutputBufferLength < 1 )
      goto LABEL_87;

3. CONTAINING_RECORD macro

This macro is commonly used in Windows drivers to get a pointer to the parent structure when we have a pointer to one of its fields; it works by subtracting the field’s offset within the parent structure (offsetof) from the field pointer.

For example, consider these two structures, used in a driver:

typedef struct _HW_STREAM_OBJECT {
  ULONG  SizeOfThisPacket;
  ULONG  StreamNumber;
  PVOID  HwStreamExtension;
  ...
} HW_STREAM_OBJECT, *PHW_STREAM_OBJECT;

struct _STREAM_OBJECT
{
  _COMMON_OBJECT ComObj;
  _FILE_OBJECT *FilterFileObject;
  _FILE_OBJECT *FileObject;
  _FILTER_INSTANCE *FilterInstance;
  _HW_STREAM_OBJECT HwStreamObject;
  ...
};

The following function accepts a pointer to _HW_STREAM_OBJECT:

void __cdecl StreamClassStreamNotification(
  int NotificationType,
  _HW_STREAM_OBJECT *StreamObject,
  _HW_STREAM_REQUEST_BLOCK *pSrb,
  _KSEVENT_ENTRY *EventEntry,
  GUID *EventSet,
  ULONG EventId);

But immediately converts it into the containing _STREAM_OBJECT:

mov     eax, [ebp+StreamObject]
test    eax, eax
push    ebx
push    esi
lea     esi, [eax-_STREAM_OBJECT.HwStreamObject]

Default decompilation doesn’t look great:

  char *v6; // esi@1
  v6 = (char *)&StreamObject[-2] - 36;

There are two ways to make it nicer:

  1. Change the type of v6 to _STREAM_OBJECT*. The decompiler will detect that the expression “lines up” and convert it to use the macro.
  2. Right-click on the delta being subtracted (-36), select “Structure offset” and choose _STREAM_OBJECT from the list.

In both cases you should get a nice expression:

  v6 = CONTAINING_RECORD(StreamObject, _STREAM_OBJECT, HwStreamObject);

N.B.: currently you need to refresh the decompilation (press F5) to see the changes. We’ll improve this to happen automatically in the future.

4. Kernel and user-mode macros involving fs segment access

On Windows, the fs segment is used to store various thread-specific (for user mode) or processor-specific (for kernel mode) data. Hex-Rays Decompiler 1.6 detects the most common ways of accessing them and converts them to the corresponding macros. However, this functionality requires the presence of specific types in the database. For user mode it is the _TEB structure; for kernel mode it’s the KPCR structure.

For example, consider the following code:

mov     eax, large fs:18h
mov     eax, [eax+30h]
push    24h
push    8
push    dword ptr [eax+18h]
call    ds:__imp__RtlAllocateHeap@12 ; RtlAllocateHeap(x,x,x)
mov     esi, eax

If you don’t have the _TEB structure in types, this will be decompiled to:

  v5 = RtlAllocateHeap(*(_DWORD *)(*(_DWORD *)(__readfsdword(24) + 48) + 24), 8, 36);

However, if you do add the type, it will look much nicer:

  v5 = RtlAllocateHeap(NtCurrentTeb()->ProcessEnvironmentBlock->ProcessHeap, 8, 36);

Currently we support the following macros:

Macro                         Required types
NtCurrentTeb                  _TEB
KeGetPcr                      KPCR
KeGetCurrentPrcb              KPCR, KPRCB
KeGetCurrentProcessorNumber   KPCR
KeGetCurrentThread            KPCR, _KTHREAD

Hint: the easiest way to get _TEB or KPCR types into your database is using the PDB plugin. Invoke it from File|Load file|PDB file…, enter a path to kernel32.dll (for user-mode code) or ntoskrnl.exe (for kernel-mode code), and check the “Types only” checkbox.

(screenshot: kernpdb)

PDBs for those two files usually contain the necessary OS structures.

We hope you will like these new additions. Note that version 1.6 includes even more improvements and fixes; see the full list of new features and the comparison page.

IDA Pro 6.2 beta

Soon we are going to start testing the next IDA version. There will be many improvements. Some of them we have mentioned previously:

  • Proximity view
  • PE+ support for Bochs (64-bit PE files)
  • UI shortcut editor
  • Filters in choosers
  • Database snapshots

Other new major features:

  • GUI installers for Linux and OS X


  • Automatic check for new versions:

  • Cross-references to structure members:

  • Floating licenses: our licensing system is now more flexible and allows big enterprises to purchase floating licenses. Contact sales@hex-rays.com for more information.

If you have an active license and would like to test the beta, please send a message to support@hex-rays.com.