Hardware

Overview


Our test relies on fluorescence caused by fluorophore-quencher separation to signify a positive bTB result. This fluorescence is invisible to the naked eye, so a reliable and highly sensitive method of photodetection is required. Our hardware is a low-cost, LED-based fluorometer used to measure the fluorescence. A prototype open-source lock-in amplifier, built from affordable and accessible electronic components, is used in conjunction with a mobile phone app to give users rapid feedback on our test. The frame that holds the optical and electronic components is 3D printed using sustainable PLA.



Figure 1: Images of the hardware. A 3D printed casing holds the LED (used to excite the fluorophores), a bandpass filter, the sample in a cuvette and the printed circuit board. The circuit board has a microcontroller which takes inputs from the amplified photodiode and performs a lock-in amplification algorithm, the outputs of which are sent to a device where they are displayed.


Figure 2: Overview of the hardware system. The cuvette containing the sample mixed with the probes is excited by an LED, causing the probes to fluoresce. This light then passes through the band-pass filter, which reduces the amount of LED light reaching the sensor, so the fluorescent signal can be measured with reduced noise. The sensor is an amplified photodiode that converts the fluorescent signal into an electrical signal.

Hardware Theory and Design


Our hardware has been adapted from a paper by A.J. Harvie, S.K. Yadav and J.C. de Mello [1], which provides open-source files for producing a low-cost lock-in amplifier. 3D printing was used to produce a casing for the printed circuit board (PCB), the optical components and the sample being tested. The Bill of Materials is listed below; in total, the cost of producing the hardware was below £200.

Bill of Materials: list of the components that were ordered and used to construct the hardware, including their prices

Summary


Our hardware uses lock-in amplification to recover weak fluorescent signals from a noisy background. This is especially important for use in the field, where there can be ambient light interference. A light-emitting diode (LED) flashing at a fixed frequency excites the fluorophores present in the probes. The emitted light passes through a band-pass filter (which only allows light within a specific wavelength range through) and is then detected by an amplified photodiode, at which point the lock-in amplification process begins. The amplified photodiode conditions the signal so that it can be sent to the microcontroller and read as a digital signal. The analogue signal is sampled by the microcontroller, multiplied by two reference signals, and each product is passed through a digital low-pass filter (exponential averaging) before the two filtered components are combined. The final output is sent to a Bluetooth module, which transmits the data to a phone to be plotted on an intensity-time graph, giving the user instant feedback.

Specifics


Analogue signal processing: The amplified photodiode (OPT101) detects light and converts it into an electrical signal. The signal produced by the photodiode must be conditioned for the microcontroller's (Teensy 4.0) analogue-to-digital converter (ADC). This takes three separate steps. Firstly, feedback compensation allows the OPT101 to operate under ambient light, or with other electrical interference at its contacts, without saturating the signal sent to the ADC. Secondly, a summing amplifier shifts the signal from the range -1.6 V to +1.6 V into the positive range 0 V to +3.2 V, allowing it to be sampled by the ADC without discarding the negative portion of the original signal. Finally, the signal is passed through a low-pass filter that removes high-frequency noise components above 94 kHz [1].
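To illustrate the effect of this level shift on the sampled values, the short Dart sketch below converts a raw 12-bit ADC code back into the signed photodiode voltage by removing the +1.6 V offset added by the summing amplifier. It is a minimal example rather than our firmware, and the 3.3 V ADC reference is an assumption.

    // Minimal sketch (not the Teensy firmware): recover the signed photodiode
    // voltage from a 12-bit ADC code, assuming a 3.3 V ADC reference.
    double adcToSignalVolts(int raw, {double vRef = 3.3, double offset = 1.6}) {
      final volts = raw / 4095 * vRef; // 12-bit code -> 0..vRef volts
      return volts - offset;           // undo the +1.6 V shift -> roughly -1.6..+1.6 V
    }

    void main() {
      print(adcToSignalVolts(2048)); // a mid-scale reading maps back to about 0 V
    }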

Digital Lock-In Amplification: The analogue signal of frequency f0 is sent to the built-in ADC of the microcontroller. The ADC samples the signal at successive discrete times t(1), t(2), t(3), …, separated by a fixed interval ∆t. Each sample S(n), where n is the sample number, is obtained by the ADC as a 12-bit integer value and then converted to a 64-bit double-precision floating-point value (so that fractional values can be stored and processed). After each sample is obtained, the reference signals Qx(n) and Qy(n) are updated:

Qx(n) = cos(2πn/m),   Qy(n) = sin(2πn/m)

where m is also an integer (the number of samples per modulation period). The LED driver is turned on and off every m/2 steps, i.e. when Qy(n) = 0, so the fluorescent signal has the same frequency as, and a constant phase difference from, the reference signals. An intermediate step then produces two unfiltered outputs, X0(n) and Y0(n), by multiplying the current sample by the updated reference signals:

X0(n) = Qx(n) · S(n),   Y0(n) = Qy(n) · S(n)

These signals are then exponentially smoothed (averaged), the computational equivalent of passing them through a low-pass filter, yielding the intermediate variables X1(n) and Y1(n):

X1(n) = α·X0(n) + (1 - α)·X1(n - 1),   Y1(n) = α·Y0(n) + (1 - α)·Y1(n - 1)

where α is a weighting constant that determines the cut-off frequency (and hence time constant) of the low-pass filter [2]. The filtering is then repeated to further attenuate non-DC interference and hence give a less noisy fluorescent signal:

X2(n) = α·X1(n) + (1 - α)·X2(n - 1),   Y2(n) = α·Y1(n) + (1 - α)·Y2(n - 1)

The output signal, R(n), is finally produced by vectorially adding X2(n) and Y2(n):

R(n) = √( X2(n)² + Y2(n)² )

R(n) is proportional to the amplitude of the first harmonic of the fluorescent signal [1,3], and hence to the average fluorescence intensity of our sample with the probes. There is still some residual LED light leakage that raises the measured intensity; however, most of it is removed by the bandpass filter and the remainder is further attenuated by the exponential filtering.
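To make the per-sample update concrete, the sketch below implements the steps above in Dart (the same language as our app code). The real implementation runs as firmware on the Teensy, so treat this as an illustrative model rather than the production code; it assumes the exponential-averaging form X1(n) = α·X0(n) + (1 - α)·X1(n - 1) shown above.

    import 'dart:math';

    // Minimal model of the digital lock-in step: m is the number of samples
    // per LED modulation period, alpha the exponential-averaging weight.
    class DigitalLockIn {
      DigitalLockIn({required this.m, required this.alpha});

      final int m;
      final double alpha;
      int n = 0;
      double x1 = 0, y1 = 0, x2 = 0, y2 = 0;

      // Feed one ADC sample S(n); returns the current output R(n).
      double update(double sample) {
        final qx = cos(2 * pi * n / m);     // reference signals Qx(n), Qy(n)
        final qy = sin(2 * pi * n / m);
        final x0 = qx * sample;             // unfiltered products X0(n), Y0(n)
        final y0 = qy * sample;
        x1 = alpha * x0 + (1 - alpha) * x1; // first exponential filter
        y1 = alpha * y0 + (1 - alpha) * y1;
        x2 = alpha * x1 + (1 - alpha) * x2; // second exponential filter
        y2 = alpha * y1 + (1 - alpha) * y2;
        n++;
        return sqrt(x2 * x2 + y2 * y2);     // vector sum R(n)
      }
    }

    void main() {
      final lockIn = DigitalLockIn(m: 64, alpha: 0.01);
      var r = 0.0;
      for (var i = 0; i < 20000; i++) {
        // Simulated detector signal: a DC offset plus modulated fluorescence.
        final s = 1.0 + 0.5 * cos(2 * pi * i / 64);
        r = lockIn.update(s);
      }
      print(r); // settles near 0.25, i.e. half the 0.5 modulation amplitude
    }

In this model the constant (DC) part of the signal, such as ambient light, averages away in the multiply-and-filter steps, which is exactly why the lock-in approach is suited to use in the field.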

The microcontroller then transmits this output to a Bluetooth module via micro-USB. The Bluetooth module sends the data on to a mobile phone running an application that processes it.


Figure 3: (left to right) The analogue signal is first sampled by the Teensy's ADC and split into two channels, each multiplied by one of the reference signals Qx(n) and Qy(n). The resulting signals are then filtered twice using exponential averaging, and the final output is obtained by vectorially adding the two filtered signals together.



Attributions


The design of this hardware also involved the input of industry professionals to ensure that it was both sensitive and inexpensive. We spoke to Prof Julian Moger and Dr Ben Gardner, whose insight and information were invaluable to the design of our hardware. The information from these discussions was broken down using the ICCD framework developed for our stakeholder interviews, as ultimately these participants were stakeholders in our project design.



Prof Julian Moger

Intention:

Discussion with photonics expert Prof Julian Moger about sensitive methods of fluorescence detection, as we were having issues with the sensitivity of silicon-based photodiodes.

Contribution:

Julian provided a paper from the Norwegian University of Science and Technology (NTNU), in which a team of researchers produced a low-cost photodetector based on the principles of lock-in amplification [1].

Conclusion:

We now had a good starting point from which to redesign the hardware. Since the design was open source, we were able to reproduce the hardware and adapt it to our own needs.

Direction:

This led us to research lock-in amplification as an alternative to a mini-spectrometer, as we felt it was more in line with our team's principle of economic accessibility. Julian also pointed us to Dr Ben Gardner.


Dr Ben Gardner

Intention:

Discussion with biophotonics expert Dr Ben Gardner, who also has experience in developing lower-cost methods of photodetection.

Contribution:

We discussed the NTNU paper. He also sent another paper, which produced a photodetector using a mini-spectrometer of the kind used by photographers. Although spectrometry was a path we had previously stopped exploring, the paper did use a Bluetooth module in conjunction with a phone app to give a readout to the user [3].

Conclusion:

The idea of using a Bluetooth module and phone app was already in practice, and could be applied to our own project.

Direction:

Lock-in amplification was still the avenue we wanted to explore further, but we now knew with greater certainty that our goal of pairing a phone app with our hardware was in fact achievable.

Mobile Phone App


Our final device needs to be operable by a farmer or vet in the field. We decided that the best way of achieving this was to create a mobile phone app, which could receive images from the device, display them, and give a reading of the likelihood of a bTB infection. Two main routes were explored: either the phone could take the pictures directly, by inserting its camera end into the device, or it could receive images via Bluetooth from a microcontroller inside the device, which would include its own camera.

All programming took place in Flutter, via Android Studio. While it proved temperamental, it was user-friendly and helpful to newcomers to the language (as we all were initially).

Plan A, using the phone's inbuilt camera, was the simplest to achieve (and is activated using the left button on the home screen). To make the app as versatile as possible, and adaptable to different phones, a list of available cameras is brought up for selection (currently, for testing purposes, 'camera 1' is set automatically, but this is simple to change). Once a photo is taken, it is displayed on the screen so that a result can be recorded.

This worked well, but requiring the system to work with any phone presented issues. Any calibration of the device will be camera-specific, so the user would need to perform this before they could use the test - and worse, the device box would have to be built so that any phone could both fit into it and have its camera appropriately placed. Plan B, building an app to work over Bluetooth, complicated the jobs of both the programmer and the physical device builder, as the app would need to interface with many other devices, and our device would now need to contain its own controller, camera, and Bluetooth communicator. App-wise, we can currently scan for devices, but this brings up an error because the variable holding the device name must be initialised (and there are no devices until the scan has taken place...). A simple communication page allowing the user to send signals to the device has been developed and is presented within the code below. The physical hardware also encountered issues - the camera was found to be ineffective for fluorescein concentrations below 100 nM. A future extension to this project could work on improving this.
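One possible way around the 'must be initialised' error is to keep the not-yet-discovered device name nullable and fall back to a placeholder until a scan has found something. The standalone sketch below shows the pattern; the module name 'HC-05' is a hypothetical example, not part of the app code listed further down.

    // Standalone sketch of the nullable-with-fallback pattern; 'HC-05' below
    // is a hypothetical module name, not one from our hardware.
    class ScanState {
      String? deviceName; // null until a Bluetooth scan has found a device
      String get label => deviceName ?? 'No device found yet';
    }

    void main() {
      final state = ScanState();
      print(state.label);         // before scanning: placeholder text
      state.deviceName = 'HC-05'; // value set once discovery returns a device
      print(state.label);
    }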


Figure 5: The Android Studio program: a list of files in use are in a pane on the left, a similar pane on the right shows a visualisation of a mobile phone, and in the centre is the code itself.

App Prototype

Figure 6: The homepage of our app. Below a top bar giving the name and version number (v0.3 - Jersey update) is our snazzy logo, and underneath this are two buttons marked 'camera' and 'bluetooth'.


Figure 7: The 'camera' window - this shows a live feed from the phone camera, and has a button at the bottom to take a photo.


Figure 8: Once the picture has been taken, the app goes to a new page, displaying the image which was just taken.

Full code
main.dart

import 'package:flutter/material.dart';
import 'package:cowtest/home.dart';
import 'package:cowtest/cowtest.dart';
import 'package:cowtest/Camera.dart';
import 'package:cowtest/loading.dart';
import 'package:camera/camera.dart';
import 'package:cowtest/btinitial.dart';

Future main() async {

    // Ensure that plugin services are initialized so that `availableCameras()`
    // can be called before `runApp()`
    WidgetsFlutterBinding.ensureInitialized();

    // Obtain a list of the available cameras on the device.
    final cameras = await availableCameras();

    // Get a specific camera from the list of available cameras.
    final firstCamera = cameras.first;

    runApp(MaterialApp(
      initialRoute: '/home',
      routes: {
      '/': (context) => const Loading(),
      '/home': (context) => const Home(),
      '/cowtest': (context) => const CowTest(),
      '/Camera': (context) => TakePictureScreen(camera: firstCamera),
      //'/bluetooth': (context) => const Bluetooth(),
      '/btinitial': (context) => const BTInitial(),
      //'/btconnect': (context) => const BTConnect(),
      },
    ));
}

loading.dart

import 'package:flutter/material.dart';
class Loading extends StatefulWidget {

    const Loading({super.key});
    @override
    _LoadingState createState() => _LoadingState();
}
class _LoadingState extends State {
    @override
    Widget build(BuildContext context) {
      return const Scaffold(
        body: SafeArea(child: Text('cow test'))
      );
    }
}

home.dart

import 'package:flutter/material.dart';

class Home extends StatefulWidget {

    const Home({super.key});
    @override
    _HomeState createState() => _HomeState();
}

class _HomeState extends State<Home> {
    @override
    Widget build(BuildContext context) {
      return Scaffold(
        backgroundColor: Colors.green[200],
        appBar: AppBar(
          automaticallyImplyLeading: false,
          backgroundColor: Colors.green[500],
          title: const Text('Cow App (alpha v0.3)'),
        ),
        body: Column(
          mainAxisAlignment: MainAxisAlignment.spaceEvenly,
            children: [
              Container(
                color:Colors.green[300],
                child: const Center(
                  child: Text(
                    "'Jersey update' (Bluetooth connection)",
                  ),
                ),
              ),
              Container(
                child:Image.asset('resources/logo.png'),
              ),
              Container(
                color:Colors.green[300],
                child: const Center(
                  //child: Text(
                  // "Next on to do list: get hold of an Arduino and begin physical testing."
                  //),
                ),
              ),
              Container(
                child:Image.asset('resources/cow.gif'),
              ),
              Row(
                mainAxisAlignment: MainAxisAlignment.center,
                children: [
                  Container(
                    padding:const EdgeInsets.all(25.0),
                    child: FloatingActionButton(
                      onPressed: () {
                        Navigator.pushNamed(context, '/Camera');
                      },
                      child: const Icon(Icons.camera_alt)
                  )
                  ),
                  Container(
                    padding:const EdgeInsets.all(25.0),
                    child: FloatingActionButton(
                      onPressed: () {
                        Navigator.pushNamed(context, '/btinitial');
                        },
                        backgroundColor: Colors.brown[500],
                        child: const Icon(Icons.bluetooth,
                        color: Colors.green,
                        ),
                      ),
                    ),
                    ],
                ),
              ]
          )
        );
      }
    }

Camera.dart

import 'package:camera/camera.dart';
import 'package:flutter/material.dart';
import 'dart:async';
import 'dart:io';
// A screen that allows users to take a picture using a given camera.
class TakePictureScreen extends StatefulWidget {

    const TakePictureScreen({
      super.key,
      required this.camera,
    });
    final CameraDescription camera;
    @override
    TakePictureScreenState createState() => TakePictureScreenState();
}
// The State must be typed so that `widget.camera` is accessible below.
class TakePictureScreenState extends State<TakePictureScreen> {
    late CameraController _controller;
    late Future _initializeControllerFuture;
    @override
    void initState() {
      super.initState();
      // To display the current output from the Camera,
      // create a CameraController.
      _controller = CameraController(
        // Get a specific camera from the list of available cameras.
        widget.camera,
        // Define the resolution to use.
        ResolutionPreset.medium,
      );
    // Next, initialize the controller. This returns a Future.
    _initializeControllerFuture = _controller.initialize();
}
@override
void dispose() {
    // Dispose of the controller when the widget is disposed.
    _controller.dispose();
    super.dispose();
}
@override
Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(
        title: const Text('Cow Camera'),
        // green[1000] is not a valid Material shade (it evaluates to null),
        // so use the darkest available shade instead.
        backgroundColor: Colors.green[900],
      ),
      // You must wait until the controller is initialized before displaying the
      // camera preview. Use a FutureBuilder to display a loading spinner until the
      // controller has finished initializing.
      body: FutureBuilder(
        future: _initializeControllerFuture,
        builder: (context, snapshot) {
          if (snapshot.connectionState == ConnectionState.done) {
            // If the Future is complete, display the preview.
            return CameraPreview(_controller);
          } else {
            // Otherwise, display a loading indicator.
            return const Center(child: CircularProgressIndicator());
          }
        },
      ),
      floatingActionButton: FloatingActionButton(
    // Provide an onPressed callback.
    onPressed: () async {
      // Take the Picture in a try / catch block. If anything goes wrong,
      // catch the error.
      try {
        // Ensure that the camera is initialized.
        await _initializeControllerFuture;
        // Attempt to take a picture and get the file `image`
        // where it was saved.
        final image = await _controller.takePicture();
        if (!context.mounted) return;
        // If the picture was taken, display it on a new screen.
        await Navigator.of(context).push(
          MaterialPageRoute(
            builder: (context) => DisplayPictureScreen(
              // Pass the automatically generated path to
              // the DisplayPictureScreen widget.
              imagePath: image.path,
            ),
          ),
        );
      } catch (e) {
        // If an error occurs, log the error to the console.
        print(e);
      }
    },
    child: const Icon(Icons.camera_alt),
      ),
    );
  }
}
// A widget that displays the picture taken by the user.
class DisplayPictureScreen extends StatelessWidget {
    final String imagePath;
    const DisplayPictureScreen({super.key, required this.imagePath});
    @override
    Widget build(BuildContext context) {
      return Scaffold(
        appBar: AppBar(title: const Text('Image')),
        // The image is stored as a file on the device. Use the `Image.file`
        // constructor with the given path to display the image.
        body: Image.file(File(imagePath)),
      );
    }
}

cowtest.dart

import 'package:flutter/material.dart';
class CowTest extends StatefulWidget {

    const CowTest({super.key});
    @override
    _CowTestState createState() => _CowTestState();
}
class _CowTestState extends State {
    @override
    Widget build(BuildContext context) {
      return Scaffold(
        appBar: AppBar(
          backgroundColor: Colors.green[500],
          title: const Text('Cow App (alpha v0.3)'),
        ),
        body: const SafeArea(child: Text('cow test'))
      );
    }
}

bluetooth.dart

import 'dart:async';
import 'dart:convert';
import 'dart:typed_data';
import 'package:flutter/material.dart';
import 'package:flutter_bluetooth_serial/flutter_bluetooth_serial.dart';
class ChatPage extends StatefulWidget {

    final BluetoothDevice server;
    const ChatPage({super.key, required this.server});
    @override
    _ChatPage createState() => _ChatPage();
      }
      class _Message {
        int whom;
        String text;
        _Message(this.whom, this.text);
      }
      // The State must be typed so that `widget.server` is accessible below.
      class _ChatPage extends State<ChatPage> {
        static const clientID = 0;
        late BluetoothConnection connection;
        List<_Message> messages = <_Message>[];
        String _messageBuffer = '';
        final TextEditingController textEditingController =
        TextEditingController();
        final ScrollController listScrollController = ScrollController();
        bool isConnecting = true;
        bool get isConnected => connection.isConnected;
        bool isDisconnecting = false;
        @override
        void initState() {
          super.initState();
          BluetoothConnection.toAddress(widget.server.address).then((newConnection) {
            print('Connected to the device');
            // Store the connection in the State field; previously the callback
            // parameter shadowed the field, so the assignment had no effect.
            connection = newConnection;
            setState(() {
              isConnecting = false;
              isDisconnecting = false;
            });
            connection.input?.listen(_onDataReceived).onDone(() {
              // Example: Detect which side closed the connection
              // There should be `isDisconnecting` flag to show are we are (locally)
              // in middle of disconnecting process, should be set before calling
              // `dispose`, `finish` or `close`, which all causes to disconnect.
              // If we except the disconnection, `onDone` should be fired as result.
              // If we didn't except this (no flag set), it means closing by remote.
              if (isDisconnecting) {
                print('Disconnecting locally!');
              } else {
                print('Disconnected remotely!');
              }
              if (mounted) {
                setState(() {});
              }
            });
          }).catchError((error) {
            print('Cannot connect, exception occured');
            print(error);
          });
        }
        @override
        void dispose() {
          // Avoid memory leak (`setState` after dispose) and disconnect
          if (isConnected) {
            isDisconnecting = true;
            connection.dispose();
            //connection = null; (I commented this out as it was breaking the code, not sure if it will work now...
          }
          super.dispose();
        }
        @override
        Widget build(BuildContext context) {
          final List list = messages.map((message) {
            return Row(
              mainAxisAlignment: message.whom == clientID
                ? MainAxisAlignment.end
                : MainAxisAlignment.start,
              children: [
                Container(
                  padding: const EdgeInsets.all(12.0),
                  margin: const EdgeInsets.only(bottom: 8.0, left: 8.0, right: 8.0),
                  width: 222.0,
                  decoration: BoxDecoration(
                    color:
                    message.whom == clientID ? Colors.blueAccent : Colors.grey,
                    borderRadius: BorderRadius.circular(7.0)),
                  child: Text(
                    (text) {
                    return text == '/shrug' ? '¯\\_(ツ)_/¯' : text;
                    }(message.text.trim()),
                    style: const TextStyle(color: Colors.white)),
                ),
              ],
            );
          }).toList();
          return Scaffold(
            appBar: AppBar(
              title: (isConnecting
                ? const Text('Connecting chat...')
                : isConnected
                ? const Text('Live chat')
                : const Text('Chat log'))),
            body: SafeArea(
              child: Column(
                children: [
                  Container(
                    padding: const EdgeInsets.all(5),
                    width: double.infinity,
                    child: FittedBox(
                      child: Row(
                        children: [
                          FloatingActionButton(
                            onPressed: isConnected ? () => _sendMessage('1') : null,
                            child: ClipOval(child: Image.asset('images/ledOn.png')),
                          ),
                          FloatingActionButton(
                            onPressed: isConnected ? () => _sendMessage('0') : null,
                            child: ClipOval(child: Image.asset('images/ledOff.png')),
                          ),
                        ],
                      ),
                    ),
                  ),
                  Flexible(
                    child: ListView(
                    padding: const EdgeInsets.all(12.0),
                    controller: listScrollController,
                    children: list),
                  ),
                  Row(
                    children: [
                      Flexible(
                        child: Container(
                          margin: const EdgeInsets.only(left: 16.0),
                          child: TextField(
                            style: const TextStyle(fontSize: 15.0),
                            controller: textEditingController,
                            decoration: InputDecoration.collapsed(
                              hintText: isConnecting
                                ? 'Wait until connected...'
                                : isConnected
                                ? 'Type your message...'
                                : 'Chat got disconnected',
                              hintStyle: const TextStyle(color: Colors.grey),
                            ),
                            enabled: isConnected,
                          ),
                        ),
                      ),
                      Container(
                        margin: const EdgeInsets.all(8.0),
                        child: IconButton(
                          icon: const Icon(Icons.send),
                          onPressed: isConnected
                            ? () => _sendMessage(textEditingController.text) : null),
                      ),
                    ],
                  )
                ],
              ),
            ),
          );
        }
        void _onDataReceived(Uint8List data) {
          // Allocate buffer for parsed data
          int backspacesCounter = 0;
          for (var byte in data) {
            if (byte == 8 || byte == 127) {
              backspacesCounter++;
            }
          }
          Uint8List buffer = Uint8List(data.length - backspacesCounter);
          int bufferIndex = buffer.length;
          // Apply backspace control character
          backspacesCounter = 0;
          for (int i = data.length - 1; i >= 0; i--) {
            if (data[i] == 8 || data[i] == 127) {
              backspacesCounter++;
            } else {
              if (backspacesCounter > 0) {
                backspacesCounter--;
              } else {
                buffer[--bufferIndex] = data[i];
              }
            }
          }
          // Create message if there is new line character
          String dataString = String.fromCharCodes(buffer);
          int index = buffer.indexOf(13);
          if (~index != 0) {
            setState(() {
              messages.add(
                _Message(
                  1,
                  backspacesCounter > 0
                    ? _messageBuffer.substring(
                    0, _messageBuffer.length - backspacesCounter)
                    : _messageBuffer + dataString.substring(0, index),
                ),
              );
              _messageBuffer = dataString.substring(index);
            });
          } else {
            _messageBuffer = (backspacesCounter > 0
              ? _messageBuffer.substring(
              0, _messageBuffer.length - backspacesCounter)
              : _messageBuffer + dataString);
          }
        }
        void _sendMessage(String text) async {
          text = text.trim();
          textEditingController.clear();
          if (text.isNotEmpty) {
            try {
              connection.output.add(utf8.encode("$text\r\n"));
              await connection.output.allSent;
              setState(() {
                messages.add(_Message(clientID, text));
              });
              Future.delayed(const Duration(milliseconds: 333)).then((_) {
                listScrollController.animateTo(
                  listScrollController.position.maxScrollExtent,
                  duration: const Duration(milliseconds: 333),
                  curve: Curves.easeOut);
              });
            } catch (e) {
            // Ignore error, but notify state
            setState(() {}); }
          }
        }
      }

btconnect.dart

import 'package:flutter/material.dart';
import 'package:flutter_bluetooth_serial/flutter_bluetooth_serial.dart';
import 'dart:async';
import 'package:cowtest/BTDevice.dart';
class SelectBondedDevicePage extends StatefulWidget {

    /// If true, on page start there is performed discovery upon the bonded devices.
    /// Then, if they are not avaliable, they would be disabled from the selection.
    final bool checkAvailability;
    final Function onCahtPage;
    const SelectBondedDevicePage(
      {super.key, this.checkAvailability = true, required this.onCahtPage});
    @override
    _SelectBondedDevicePage createState() => _SelectBondedDevicePage();
    //TODO: this might also break stuff (line above)
}
enum _DeviceAvailability {
    no,
    maybe,
    yes,
}
class _DeviceWithAvailability extends BluetoothDevice {
    BluetoothDevice device;
    _DeviceAvailability availability;
    int rssi=0;
    _DeviceWithAvailability(
      this.device, this.availability) : super(address: '')
    ;
}
// Typed State so that `widget.checkAvailability` and `widget.onCahtPage` resolve.
class _SelectBondedDevicePage extends State<SelectBondedDevicePage> {
    List<_DeviceWithAvailability> devices = <_DeviceWithAvailability>[];
    // Availability
    // Nullable: no subscription exists until discovery has started, so the
    // previous `0 as StreamSubscription` cast threw at runtime.
    StreamSubscription? _discoveryStreamSubscription;
    bool _isDiscovering=false;
    _SelectBondedDevicePage();
    @override
    void initState() {
      super.initState();
      _isDiscovering = widget.checkAvailability;
      if (_isDiscovering) {
        _startDiscovery();
      }
      // Setup a list of the bonded devices
      FlutterBluetoothSerial.instance
          .getBondedDevices() .then((List bondedDevices) {
        setState(() {
          devices = bondedDevices
            .map(
              (device) => _DeviceWithAvailability(
            device,
            widget.checkAvailability
              ? _DeviceAvailability.maybe
              : _DeviceAvailability.yes
            ),
          )
          .toList();
        });
      });
    }
    void _restartDiscovery() {
      setState(() {
        _isDiscovering = true;
      });
      _startDiscovery();
    }
    void _startDiscovery() {
      _discoveryStreamSubscription =
        FlutterBluetoothSerial.instance.startDiscovery().listen((r) {
          setState(() {
            Iterator i = devices.iterator;
            while (i.moveNext()) {
              var device = i.current;
              if (device.device == r.device) {
                device.availability = _DeviceAvailability.yes;
                device.rssi = r.rssi;
              }
            }
          });
        });
      _discoveryStreamSubscription?.onDone(() {
        setState(() {
          _isDiscovering = false;
        });
      });
    }
    @override
    void dispose() {
      // Avoid memory leak (`setState` after dispose) and cancel discovery
      _discoveryStreamSubscription?.cancel();
      super.dispose();
    }
    @override
    Widget build(BuildContext context) {
      List list = devices
        .map(
          (device) => BluetoothDeviceListEntry(
        device: device.device,
        // rssi: _device.rssi,
        // enabled: _device.availability == _DeviceAvailability.yes,
        onTap: () {
          widget.onCahtPage(device.device);
        },
      ), )
        .toList();
      return ListView(
        children: list,
      );
      // return Scaffold(
      // appBar: AppBar(
      // title: Text('Select device'),
      // actions: [
      // _isDiscovering
      // ? FittedBox(
      // child: Container(
      // margin: new EdgeInsets.all(16.0),
      // child: CircularProgressIndicator(
      // valueColor: AlwaysStoppedAnimation(
      // Colors.white,
      // ),
      // ),
      // ),
      // )
      // : IconButton(
      // icon: Icon(Icons.replay),
      // onPressed: _restartDiscovery,
      // )
      // ],
      // ),
      // body: ListView(children: list),
      // );

    }
}
// class BTConnect extends StatefulWidget {
// const BTConnect({super.key});
//
// @override
// _BTConnectState createState() => _BTConnectState();
// }
//
// class _BTConnectState extends State {
// @override
// Widget build(BuildContext context) {
// return Scaffold(
// appBar: AppBar(
// backgroundColor: Colors.green[500],
// title: const Text('Cow App (alpha v0.3)'),
// ),
// body: const SafeArea(child: Text('cow test'))
// );
// }
// }

BTDevice.dart

import 'package:flutter/material.dart';
import 'package:flutter_bluetooth_serial/flutter_bluetooth_serial.dart';
class BluetoothDeviceListEntry extends StatelessWidget {

    final Function onTap;
    final BluetoothDevice device;
    const BluetoothDeviceListEntry({super.key, required this.onTap, required this.device});
    @override
    Widget build(BuildContext context) {
      return ListTile(
        onTap: () => onTap(), // pass a callback; calling onTap() during build fired it immediately
        leading: const Icon(Icons.devices),
        title: Text(device.name ?? "Unknown device"),
        subtitle: Text(device.address.toString()),
        trailing: FloatingActionButton(
          onPressed: () => onTap(),
          backgroundColor: Colors.blue,
          child: const Text('Connect'),
        ),
      );
    }
}

btinitial.dart

import 'package:flutter/material.dart';
import 'package:flutter_bluetooth_serial/flutter_bluetooth_serial.dart';
import 'package:cowtest/btconnect.dart';
import 'package:cowtest/bluetooth.dart';
class BTInitial extends StatefulWidget {

    const BTInitial({super.key});
    @override
    _BTInitialState createState() => _BTInitialState();
}
class _BTInitialState extends State {
    @override
    Widget build(BuildContext context) {
      return FutureBuilder( // return the builder; previously its result was discarded
        future: FlutterBluetoothSerial.instance.requestEnable(),
        builder: (context, future) {
          if (future.connectionState == ConnectionState.waiting) {
            return const Scaffold(
              body: SizedBox(
              height: double.infinity,
              child: Center(
                child: Icon(
                Icons.bluetooth_disabled,
                size: 200.0,
                color: Colors.blue))));
          } else if (future.connectionState == ConnectionState.done) {
            return const BTHome();
          } else {
            return const BTHome(); // this might break the code, remove if it does!
          }
        },
      );
    }
}
class BTHome extends StatelessWidget {
    const BTHome({super.key});
    @override
    Widget build(BuildContext context) {
      return SafeArea(
        child: Scaffold(
          appBar: AppBar(
            title: const Text("Connection"),
          ),
          body: SelectBondedDevicePage(
            onCahtPage: (device1) {
              BluetoothDevice device = device1;
              Navigator.push(
                context,
                MaterialPageRoute(
                  builder: (context) {
                    return ChatPage (server: device);
                  },
                ),
              );
            },
          ),
        ),
      );
    }
}
// return Scaffold(
// appBar: AppBar(
// backgroundColor: Colors.green[500],
// title: const Text('Cow App (alpha v0.3)'),
// ),
// body: const SafeArea(child: Text('cow test'))
// );
// }
//}



Hardware Iterations and Testing


Initial ideas of how the hardware test could function were based around using phone cameras (more specifically, an iPhone XR camera) to detect the fluorescence: the client could perform the test with our probes and then use a lens attachment on a phone to image the samples. Hardware experiments used dilutions of fluorescein, a fluorescent chemical, as a substitute for the RNA probe fluorophores because it has similar excitation and emission wavelengths. Phosphate-buffered saline (PBS) was used as the buffer for the different dilutions of fluorescein. Through experimentation, visible fluorescence was captured only with a bandpass filter in front of the lens, and only down to a fluorescein concentration of 100 nM; however, the probe results suggested that an equivalent fluorescein concentration of 40 nM would need to be detected for our test to work.


Figure 9: (left) PBS with no fluorescein in a microtube, whilst still in the presence of the blue LED light. The reflected light on the side of the microtube is blue. (right) 100 nM concentration of fluorescein. Green fluorescence is clearly visible when the sample is excited with the blue LED.
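For reference, the dilution arithmetic behind these fluorescein standards follows C1·V1 = C2·V2; the sketch below works through it in Dart, with the 10 μM stock concentration chosen purely as a hypothetical example.

    // Sketch of the C1*V1 = C2*V2 dilution arithmetic for preparing
    // fluorescein standards in PBS (the 10 uM stock is a hypothetical value).
    double stockVolumeNeeded({
      required double stockConc,  // stock concentration, in uM
      required double targetConc, // desired concentration, in uM
      required double finalVol,   // final volume, in uL
    }) =>
        targetConc * finalVol / stockConc;

    void main() {
      // uL of a 10 uM stock needed to make 1000 uL of 0.1 uM (100 nM):
      print(stockVolumeNeeded(stockConc: 10, targetConc: 0.1, finalVol: 1000)); // about 10 uL
    }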

A silicon-based photodiode was also tested; however, this gave results indistinguishable from background noise, with the overall potential difference changing by only a few millivolts. This meant that a more sensitive form of photodetection, which also amplified the fluorescent signal, was required. Other low-cost methods of photodetection were researched; initially a miniature spectrometer was considered [3], however its relative cost was less than desirable. After discussions with photonics experts, we were pointed in the direction of lock-in amplification as a method.

To test the lock-in amplifier, background noise was first measured by reading the light through 1 mL of PBS in a cuvette; this noise was then subtracted from subsequent fluorescence measurements. In testing, the fluorescence readings were indistinguishable from the background noise: even with a positive control (10 μM fluorescein) there was no significant change in the relative fluorescence reading. Initially the bandpass filter position was adjusted, with no discernible change, so the next issue to troubleshoot was the photodiode itself. This repeated the main issue of the hardware, sensitivity: the photodiode we used had a peak-response wavelength of 650 nm, which was not ideal, although we had thought its bandwidth might be wide enough to extend to 520 nm (the emission wavelength of fluorescein).
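The background-subtraction step itself is simple arithmetic; as a minimal sketch (with made-up numbers, not our measurements):

    // Relative fluorescence = sample reading minus the PBS-only blank reading.
    double relativeFluorescence(double sampleR, double blankR) => sampleR - blankR;

    void main() {
      const blank = 0.82;  // hypothetical R(n) for 1 mL of PBS
      const sample = 0.97; // hypothetical R(n) for a fluorescein sample
      print(relativeFluorescence(sample, blank)); // about 0.15 in arbitrary units
    }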


Potential Solutions to Hardware Issues


For future iGEM teams who want to integrate lock-in amplification into their project, here are a few solutions and adjustments to potentially try:

1) Choose an amplified photodiode with a peak-response (central maximum) wavelength close to the emission wavelength of your fluorophore, for example by checking that the emission wavelength lies within 90% of the photodiode's central maximum wavelength. Based on browsing photodiodes on RS®, there are no 8-pin amplified photodiodes with a central wavelength close to 520 nm, so adjustments to the printed circuit board would have to be made.

2) Use a long-pass and a short-pass filter instead of a bandpass filter. Bandpass filters can only be blue-shifted, which typically means more of the excitation photons from the LED land on the photodiode, increasing the background noise. A long-pass filter in front of the photodiode would help remove unwanted photons from the LED, and a short-pass filter in front of the LED would reduce the number of photons not involved in fluorophore excitation.

3) For use of the hardware with blood samples, the 3D printed casing could be changed to fit a Monovette® instead, as these work with the materials that vets already use.

References

    [1] Harvie AJ, Yadav SK, de Mello JC. A sensitive and compact optical detector based on digital lock-in amplification. HardwareX. 2021 Sep 2;10:e00228. doi: 10.1016/j.ohx.2021.e00228. PMID: 35607666; PMCID: PMC9123480.

    [2] Power SM, Free L, Delgado A, Richards C, Alvarez-Gomez E, Briciu-Burghina C, et al. A novel low-cost plug-and-play multi-spectral LED based fluorometer, with application to chlorophyll detection. Analytical Methods. 2023;15(41):5474–82.

    [3] Lyons RG. Understanding Digital Signal Processing. Upper Saddle River: Pearson Education International; 2013.
