I have a vaadin-combo-box in my Fusion v21 app that I am trying to populate with objects containing a name and a value. I want to display the name property in the dropdown of the combo box, and when I select an item, it should put the value in my entity, which is bound by a Binder.
private binder = new Binder<SamplePerson, SamplePersonModel>(this, SamplePersonModel);
...
<vaadin-combo-box
  .items="${[{name: 'Name1', value: 'Value1'}, {name: 'Name2', value: 'Value2'}]}"
  @value-changed="${(e: CustomEvent) => console.log(e.detail.value)}"
  item-label-path="name"
  item-value-path="value"
  ...="${field(this.binder.model.lastName)}">
</vaadin-combo-box>
My entity:
@Data
public class SamplePerson {
    @Id
    @GeneratedValue
    @Nonnull
    private Integer id;

    @NotNull
    private String lastName;
}
When I change the value, I can see in the console log that it is displaying the correct value. But when I inspect my entity in the submitTo method, I get the following:
// expected: lastName: 'Value1'
// but got:
lastName: {name: 'Name1', value: 'Value1'}
Am I doing something wrong here?
(I have refactored a start app from Vaadin using the Person form template.)
I am building a table in a React front end. I have an array list filled with IDs, and I want to populate the table using only those IDs.
The IDs are stored in my customer collection:
public class Customer {
    @Id
    private String id;
    private String username;
    private List<LinkedUsersID> linkedUsersId;
}

public class User {
    @Id
    private String id;
    private String name;
    private String surname;
    private List<LinkedUsersID> linkedUsersId;
}
Someone mentioned to me that I can reference the IDs to a Mongo collection and the data will populate itself in the table: use the ID to refer to the user collection, get the data for that specific ID, and populate the table.
Can someone please explain, or share a link on, how this works?
Use $lookup in an aggregation pipeline, like this:
{
    $lookup: {
        from: "user",                // the user collection
        localField: "linkedUsersId",
        foreignField: "id",
        as: "newUserField"
    }
}
Now run this aggregation on the customer collection; the matching user documents are returned in the newUserField array. (Note: if the referenced IDs are stored in Mongo's _id field, as they are when Spring Data maps the @Id property, use foreignField: "_id" instead.)
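Conceptually, $lookup performs a left outer join between the two collections. As a minimal in-memory sketch of its semantics (hypothetical data, plain Java rather than the MongoDB driver):

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class LookupSketch {
    // For each local id, collect the matching "user" entries,
    // just as $lookup collects matching documents into the "as" field.
    public static List<String> lookup(Map<String, String> users, List<String> linkedUsersId) {
        return linkedUsersId.stream()
                .filter(users::containsKey)
                .map(users::get)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        // Hypothetical "user" collection keyed by id
        Map<String, String> users = Map.of("u1", "Alice", "u2", "Bob");
        // A customer's linkedUsersId array
        List<String> linked = List.of("u1", "u2");
        System.out.println(lookup(users, linked)); // [Alice, Bob]
    }
}
```

The real join happens server-side inside MongoDB; this only illustrates what ends up in newUserField.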
I have a DataGridView that was bound to a generic List<> of my objects. Everything worked fine. I then changed the List to a SortableBindingList so that I can sort the columns. This works fine too, except that now I get an exception when I try to add a row. The exception is:
"Operation is not valid due to the current state of the object."
This occurs in the WinForms runtime, in the DataGridView.DataGridViewDataConnection.ProcessListChanged method.
Anyone have any ideas what the problem might be?
So you have separated your data from how it is displayed. Good for you! Too often I see people fiddling with cells and rows instead of using the data source.
However, if your DataSource is a List<...>, the changes that the operator makes in the DataGridView are not reflected in the DataSource.
If you want items that the operator adds, removes, or changes to also be changed in your DataSource, you should use an object that implements IBindingList, like (surprise!) BindingList<T>.
You forgot to tell us what you show in your DataGridView; let's assume you show Products:
class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
    public decimal Price { get; set; }
    public int StockCount { get; set; }
    public string LocationCode { get; set; }
    ...
}
Using the Visual Studio designer, you've added the columns that you want to show. You'll have to tell which column shows which property, for instance in the constructor:
public MainForm()
{
    InitializeComponent();

    // define which columns show which properties
    columnId.DataPropertyName = nameof(Product.Id);
    columnName.DataPropertyName = nameof(Product.Name);
    columnPrice.DataPropertyName = nameof(Product.Price);
    ...
To make access to the displayed products easy, add a Property:
private BindingList<Product> DisplayedProducts
{
    get => (BindingList<Product>)this.dataGridView1.DataSource;
    set => this.dataGridView1.DataSource = value;
}
Initialization:
private void ShowProducts()
{
    IEnumerable<Product> queryProducts = this.FetchProducts();
    this.DisplayedProducts = new BindingList<Product>(queryProducts.ToList());
}
Now whenever the operator adds / removes a row, or edits the cells in a row, the BindingList is automatically updated, even if the columns are rearranged, or the data sorted.
If the operator indicates that they have finished editing the data, for instance by pressing a button, you immediately have all the updated information:
private void OnButtonOk_Clicked(object sender, ...)
{
    ICollection<Product> displayedProducts = this.DisplayedProducts;
    // find out what was added / removed / changed
    this.ProcessEditedProducts(displayedProducts);
}
If you need to access selected items, consider adding the following properties:
private Product CurrentProduct => (Product)this.dataGridView1.CurrentRow?.DataBoundItem;

private IEnumerable<Product> SelectedProducts => this.dataGridView1.SelectedRows
    .Cast<DataGridViewRow>()
    .Select(row => row.DataBoundItem)
    .Cast<Product>();
If the operator adds a row, it starts out as a row constructed with the default constructor. If you don't want this, but would for instance like to initialize the Id or the LocationCode, consider subscribing to the BindingList.AddingNew event:
this.DisplayedProducts.AddingNew += OnAddingNewProduct;

void OnAddingNewProduct(object sender, AddingNewEventArgs e)
{
    e.NewObject = new Product
    {
        Id = this.GenerateProductId(),
        Name = "Please enter product name",
        LocationCode = "Unknown Location",
        ...
    };
}
I want to save both child and parent entities whenever a POST call is made. I have an Item entity with a one-to-one mapping to a parent table, Attribute:
@Entity
@Table(name = "Item")
@JsonIgnoreProperties(ignoreUnknown = true)
public class Item
{
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    @Column(name = "id")
    private Long id;

    @OneToOne(fetch = FetchType.LAZY)
    @JoinColumn(name = "attr_id")
    private Attribute attribute;

    @OneToMany(mappedBy = "item", cascade = CascadeType.ALL, orphanRemoval = true)
    private List<ValueSet> valueSets = new ArrayList<>();

    // Other fields, getters, setters; equals and hashCode overridden using
    // Objects.equals() and Objects.hashCode() for all fields; helpers for
    // adding and removing a ValueSet
}
The Attribute entity looks like this:
@Entity
@Table(name = "Attribute")
@JsonIgnoreProperties(ignoreUnknown = true)
public class Attribute
{
    @Id
    @Column(name = "id")
    private Long id;

    // Other fields, getters, setters; NOT overriding equals/hashCode
}
Whenever an Item gets saved, I need the Attribute to get saved as well. I have Postman sending JSON POST data as follows:
{
    "attribute": {
        "id": "6"
    },
    "valueSets": [
        {
            "value": "basic"
        }
    ]
}
My handler looks like this:
@PostMapping("/item")
public void postItems(@RequestBody Item item)
{
    itemRepository.save(item);
}
ItemRepository is just a one-liner with the @Repository annotation:
public interface ItemRepository extends CrudRepository<Item, Long> {}
When I try to save the Item I run into: Cannot insert the value NULL into column 'attr_id', table 'Item'; column does not allow nulls. INSERT fails.
I can't figure out why it is unable to take the id value of 6 that I am supplying as part of my POST invocation. The id value 6 already exists in the Attribute table. I have also tried making the relationship bidirectional using mappedBy and CascadeType.ALL, but I still get the same error.
Any thoughts/suggestions on what I'm missing? Also, is there a better approach to handling nested entities? Should I try to save each of them individually? In that case, can the @RequestBody still be the parent entity?
I have built an example project and tried to replicate your situation, without success: I am able to insert the item without any issue.
I placed the project in this repository: https://github.com/hepoiko/user-5483731-1
Hope this helps you troubleshoot further, or let me know if I missed anything there.
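If the error persists, one common pattern worth trying (a sketch only, not taken from the linked project, and assuming a hypothetical AttributeRepository extending CrudRepository<Attribute, Long>) is to resolve the incoming Attribute id to a managed entity before saving, so JPA can fill in the attr_id foreign key:

```java
@PostMapping("/item")
public void postItems(@RequestBody Item item)
{
    // Re-attach the existing Attribute (id 6 in the example payload)
    // so the attr_id column gets populated; attributeRepository is
    // a hypothetical injected CrudRepository<Attribute, Long>.
    Attribute managed = attributeRepository
            .findById(item.getAttribute().getId())
            .orElseThrow();
    item.setAttribute(managed);
    itemRepository.save(item);
}
```

The idea is that the Attribute deserialized from JSON is a detached object with only an id, so persisting the Item as-is may not link it; re-loading it through the repository gives JPA a managed instance to reference.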
I want to read from a DB and create a CSV file. To do that I am using camel-jdbc and camel-bindy.
First I set the body with the SELECT statement.
SELECT [vendor],
[ean],
[itemid] AS itemId,
[quantity]
FROM [dbo].[ElectronicDeliveryNotes]
then I call the jdbc component
<to uri="jdbc:dataSource?outputType=SelectList&amp;outputClass=com.xxx.Model2"/>
This returns a List of Model2. The Model2 class is:
@CsvRecord(separator = ";", generateHeaderColumns = true, crlf = "UNIX")
public class Model2 {

    @DataField(pos = 1, columnName = "A_Liererant")
    private String vendor;

    @DataField(pos = 2, columnName = "F_EAN")
    private String ean;

    @DataField(pos = 3, columnName = "G_Lief. Artikelnummer")
    private String itemId;

    @DataField(pos = 4, columnName = "H_Menge")
    private BigDecimal quantity;

    // getters, setters
}
I am getting the following error:
java.lang.IllegalArgumentException: Cannot map all properties to bean of type class com.xxx.Model2. There are 1 unmapped properties. {itemid=11.0441-5402.2}
From my understanding, the problem is the naming of the model properties. One solution that I tried, and that worked, is to rename itemId to itemid in the model. This works, but then I am not following Java naming conventions.
Do you know how to overcome this without renaming the properties?
I also tried the following, but it didn't work:
@DataField(pos = 3, columnName = "G_Lief. Artikelnummer", name = "itemid")
private String itemId;
I don't see anything wrong with your code structure.
If what you want to accomplish is retrieving rows from a table and exporting the result to a CSV based on your Model2 class, may I suggest using camel-sql. It could be something like:
@Override
protected RoutesBuilder createRouteBuilder() throws Exception {
    return new RouteBuilder() {
        @Override
        public void configure() throws Exception {
            getContext().getComponent("sql", SqlComponent.class).setDataSource(db);
            BindyCsvDataFormat camelDataFormat = new BindyCsvDataFormat(Model2.class);

            from("sql:select vendor, ean, itemid as itemId, quantity from ElectronicDeliveryNotes?outputType=SelectList&outputClass=com....model.Model2")
                .marshal(camelDataFormat)
                .log("the body:\n${body}")
                .to("mock:result");
        }
    };
}
You poll the data from the table, marshal it, and then send the message on. I've run some tests to make sure that the query results can be transformed into a CSV, and as long as you keep the field names equal to your properties, nothing seems to go wrong. (Side note: in my tests, everything went fine even without the alias.)
But, testing your code, I ran into the same error. Maybe you need to implement the beanRowMapper:
To use a custom org.apache.camel.component.jdbc.BeanRowMapper when using outputClass. The default implementation will lower case the row names and skip underscores, and dashes. For example "CUST_ID" is mapped as "custId".
My guess is that this is the reason why you are stuck on this error:
java.lang.IllegalArgumentException: Cannot map all properties to bean of type class com.xxx.Model2. There are 1 unmapped properties. {itemid=11.0441-5402.2}
Try renaming your alias to ITEM_ID.
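The default naming rule quoted above ("CUST_ID" is mapped as "custId") can be sketched in plain Java, which also shows why the alias ITEM_ID would land on the itemId property. This is an illustration of the documented rule, not Camel's actual implementation:

```java
public class RowNameMapper {
    // Sketch of the default BeanRowMapper naming rule: lower-case the
    // column name and treat '_' / '-' as camelCase word separators.
    public static String toBeanProperty(String column) {
        StringBuilder sb = new StringBuilder();
        boolean upperNext = false;
        for (char c : column.toCharArray()) {
            if (c == '_' || c == '-') {
                upperNext = true;       // skip separator, capitalize next char
                continue;
            }
            char lower = Character.toLowerCase(c);
            sb.append(upperNext ? Character.toUpperCase(lower) : lower);
            upperNext = false;
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(toBeanProperty("CUST_ID")); // custId
        System.out.println(toBeanProperty("ITEM_ID")); // itemId
    }
}
```

So with the alias ITEM_ID, the default mapping produces exactly the property name itemId, which is why renaming the alias resolves the unmapped-property error.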
I'm trying to filter with Objectify in GAE:
List<Usuario> ul = ofy.load().type(Usuario.class).filter("name", "gildus").list();
In the Usuario class I use the @Index annotation:
@Entity
public class Usuario {
    @Id
    private Long id;

    @Index
    private String name;
    ...
The filter result is empty, although the value "gildus" is there. When I use the ID field, it does show results (....filter("id", "1").list()).
What more could I do to make it work?
When I use the ID field if it shows results (....filter("id", "1").list() ).
Don't use filter for the id; use the following instead:
Usuario user = ofy.load().type(Usuario.class).id(1).get();
Also note that if the @Index annotation was added to name after entities had already been stored, those existing entities are not in the index, so filter("name", ...) won't find them until they are saved again; re-saving (re-putting) the existing entities rebuilds the index entries.